Master finger tracking device and a method of use in a minimally invasive surgical system
Patent abstract:
MASTER FINGER TRACKING DEVICE AND A METHOD OF USE IN A MINIMALLY INVASIVE SURGICAL SYSTEM. In a minimally invasive surgical system, a hand tracking system tracks the location of a sensor element mounted on part of a human hand. A system control parameter is generated based on the location of the part of the human hand. Operation of the minimally invasive surgical system is controlled using the system control parameter. Thus, the minimally invasive surgical system includes a hand tracking system. The hand tracking system tracks a location of part of a human hand. A controller coupled to the hand tracking system converts the location to a system control parameter, and injects a command based on the system control parameter into the minimally invasive surgical system.
Publication number: BR112012011277B1
Application number: R112012011277-5
Filing date: 2010-11-11
Publication date: 2020-10-13
Inventors: Brandon D. Itkowitz; Simon Dimaio; Karlin Y. Bark
Applicant: Intuitive Surgical Operations, Inc.
Main IPC class:
Patent description:
CROSS-REFERENCE TO RELATED APPLICATION

[0001] This application is a continuation-in-part of United States Patent Application No. 12/617,937 (filed November 13, 2009, disclosing "Patient-Side Surgeon Interface for a Minimally Invasive Teleoperated Surgical Instrument"), which is incorporated herein by reference in its entirety.

BACKGROUND

Field of the Invention

[0002] Aspects of this invention are related to control of minimally invasive surgical systems, and are more particularly related to using the movement of a surgeon's hand to control a minimally invasive surgical system.

Related Art

[0003] Methods and techniques for tracking hand positions and gestures are known. For example, some video game controllers use hand tracking input. The Nintendo Wii® gaming platform supports wireless position- and orientation-sensing remote controls. (Wii is a registered trademark of Nintendo of America Inc., Redmond, Washington, U.S.A.) The use of gestures and other physical movements, such as swinging a club or waving a magic wand, provides the fundamental gameplay element for this platform. The Sony PlayStation® Move has attributes similar to those of the Nintendo Wii® gaming platform.

[0004] The wireless CyberGlove® motion capture data glove from CyberGlove Systems includes eighteen data sensors, with two bend sensors on each finger, four abduction sensors, and sensors measuring thumb crossover, palm arch, wrist flexion, and wrist abduction. (CyberGlove® is a registered trademark of CyberGlove Systems LLC of San Jose, CA.) When a three-dimensional tracking system is used with the CyberGlove® motion capture data glove, x, y, z position and yaw, pitch, roll orientation information for the hand is available. The motion capture system for the CyberGlove® motion capture data glove has been used in digital prototype evaluation, virtual reality biomechanics, and animation.

[0005] Another data glove, with forty sensors, is the ShapeHand data glove from Measurand Inc. The ShapeClaw portable, lightweight hand motion capture system from Measurand Inc. includes a system of flexible ribbons that captures index finger and thumb movement together with the position and orientation of the hand and forearm in space.

[0006] In In-Cheol Kim and Sung-Il Chien, "Analysis of 3D Hand Trajectory Gestures Using Stroke-Based Composite Hidden Markov Models", Applied Intelligence, Vol. 15, No. 2, pp. 131-143, September-October 2001, Kim and Chien explore the use of three-dimensional trajectory input with a Polhemus sensor for gesture recognition. Kim and Chien propose this form of input because three-dimensional trajectories offer more discriminating power than two-dimensional gestures, which are used predominantly in video-based approaches. For their experiments, Kim and Chien used a Polhemus magnetic position tracking sensor attached to the back of a Fakespace PinchGlove. The PinchGlove provides a means for the user to signal the beginning and end of a gesture while the Polhemus sensor captures the three-dimensional trajectory of the user's hand.

[0007] In Elena Sanchez-Nielsen, et al., "Hand Gesture Recognition for Human-Machine Interaction", Journal of WSCG, Vol. 12, No. 1-3, ISSN 1213-6972, WSCG'2004, February 2-6, 2004, Plzen, Czech Republic, a real-time vision system is proposed for application within visual interaction environments through hand gesture recognition, using general-purpose hardware and low-cost sensors such as a personal computer and a web camera.
In Pragati Garg, et al., "Vision Based Hand Gesture Recognition", 49 World Academy of Science, Engineering and Technology, 972-977 (2009), a review of vision-based hand gesture recognition is presented. One conclusion presented was that most approaches rely on several underlying assumptions that may be suitable in a controlled laboratory environment but do not generalize to arbitrary settings. The authors state: "Computer vision methods for hand gesture interfaces must surpass current performance in terms of robustness and speed to achieve interactivity and usability." In the medical field, gesture recognition has been considered for sterile browsing of radiology images. See Juan P. Wachs, et al., "A Gesture-based Tool for Sterile Browsing of Radiology Images", Journal of the American Medical Informatics Association (2008; 15:321-323, DOI 10.1197/jamia.M24).

SUMMARY

[0008] In one aspect, a hand tracking system in a minimally invasive surgical system tracks the location of part of a human hand. A system control parameter of the minimally invasive surgical system is generated based on the location of the part of the human hand. Operation of the minimally invasive surgical system is controlled using the system control parameter.

[0009] In one aspect, sensor elements mounted on part of a human hand are tracked to obtain locations of the part of the human hand. A position and an orientation of a control point are generated based on the locations. Teleoperation of a device in the minimally invasive surgical system is controlled based on the control point position and orientation. In one aspect, the device is a teleoperated (remotely operated) slave surgical instrument. In another aspect, the device is a virtual proxy presented in a video image of a surgical site. Examples of a virtual proxy include a virtual slave surgical instrument, a virtual hand, and a virtual telestration device (a remote-surgery illustration aid).

[0010] In a further aspect, a grip closure parameter is generated in addition to the control point position and orientation. A grip of an end effector of the teleoperated slave surgical instrument is controlled based on the grip closure parameter.

[0011] In another aspect, the system control parameter is a position and orientation of a control point used in teleoperation of the slave surgical instrument. In yet another aspect, the system control parameter is determined from two hands. The system control parameter includes a position and orientation of a control point for one of the two hands, and a position and orientation of a control point for the other of the two hands. The control points are used in teleoperation of an endoscopic camera manipulator in the minimally invasive surgical system.

[0012] In one aspect, sensor elements mounted on part of a second human hand are tracked to obtain locations of the part of the second human hand. A position and orientation of a second control point are generated based on the locations. In this aspect, both the control point and the second control point are used to control the teleoperation.

[0013] In yet another aspect, sensor elements mounted on fingers of a human hand are tracked. A motion between the fingers is determined, and the orientation of a teleoperated slave surgical instrument in the minimally invasive surgical system is controlled based on the motion.
[0014] When the motion is a first motion, the control includes rolling the wrist tip of the slave surgical instrument about its pointing direction. When the motion is a second motion different from the first motion, the control includes yaw motion of the slave surgical instrument wrist.

[0015] A minimally invasive surgical system includes a hand tracking system and a controller coupled to the hand tracking system. The hand tracking system tracks locations of a plurality of sensor elements mounted on part of a human hand. The controller transforms the locations into a position and an orientation of a control point. The controller sends a command to move a device in the minimally invasive surgical system based on the control point. Again, in one aspect the device is a teleoperated slave surgical instrument, while in another aspect the device is a virtual proxy presented in a video image of a surgical site.

[0016] In one aspect, the system also includes a master finger tracking device including the plurality of tracking sensors. The master finger tracking device further includes a compressible body, a first finger loop attached to the compressible body, and a second finger loop attached to the compressible body. A first tracking sensor in the plurality of tracking sensors is attached to the first finger loop. A second tracking sensor in the plurality of tracking sensors is attached to the second finger loop.

[0017] Therefore, in one aspect, a minimally invasive surgical system includes a master finger tracking device. The master finger tracking device includes a compressible body, a first finger loop attached to the compressible body, and a second finger loop attached to the compressible body. A first tracking sensor is attached to the first finger loop. A second tracking sensor is attached to the second finger loop.

[0018] The compressible body includes a first end, a second end, and an outer exterior surface. The outer exterior surface includes a first portion extending between the first and second ends, and a second portion, opposite and removed from the first portion, extending between the first and second ends.

[0019] The compressible body also has a length. The length is selected to limit the separation between a first finger and a second finger of the human hand.

[0020] The first finger loop is attached to the compressible body adjacent the first end and extends about the first portion of the outer exterior surface. Upon placement of the first finger loop on a first finger of a human hand, a first part of the first portion of the outer exterior surface contacts the first finger.

[0021] The second finger loop is attached to the compressible body adjacent the second end and extends about the first portion of the outer exterior surface. Upon placement of the second finger loop on a second finger of the human hand, a second part of the first portion of the outer exterior surface contacts the second finger. When the first and second fingers are moved toward each other, the compressible body is positioned between the two fingers so that the compressible body provides resistance to the motion.

[0022] A thickness of the compressible body is selected so that when a tip of the first finger just touches a tip of the second finger, the compressible body is less than fully compressed.
The compressible body is configured to provide tactile feedback corresponding to a grip force of an end effector of a teleoperated slave surgical instrument.

[0023] In one aspect, the first and second tracking sensors are passive electromagnetic sensors. In a further aspect, each passive electromagnetic tracking sensor has six degrees of freedom.

[0024] A method of using the master finger tracking device includes tracking a first location of a sensor mounted on a first finger of a human hand and a second location of another sensor mounted on a second finger. Each location has N degrees of freedom, where N is an integer greater than zero. The first location and the second location are mapped to a control point location. The control point location has six degrees of freedom. The six degrees of freedom are less than or equal to the 2*N degrees of freedom. The first location and the second location are also mapped to a parameter having a single degree of freedom. Teleoperation of a slave surgical instrument in a minimally invasive surgical system is controlled based on the control point location and the parameter.

[0025] In a first aspect, the parameter is a grip closure distance. In a second aspect, the parameter comprises an orientation. In another aspect, N is six, while in a different aspect, N is five.

[0026] In still a further aspect, sensor elements mounted on part of a human hand are tracked to obtain a plurality of locations of the part of the human hand. A hand gesture from a plurality of known hand gestures is selected based on the plurality of locations. Operation of the minimally invasive surgical system is controlled based on the hand gesture.

[0027] The hand gesture can be any one of a hand gesture pose, a hand gesture trajectory, or a combination of a hand gesture pose and a hand gesture trajectory. When the hand gesture is a hand gesture pose and the plurality of known hand gestures includes a plurality of known hand gesture poses, a user interface of the minimally invasive surgical system is controlled based on the hand gesture pose.

[0028] Furthermore, in one aspect, when the hand gesture is a hand gesture pose, the selection of the hand gesture includes generating an observed feature set from the plurality of tracked locations. The observed feature set is compared with the feature sets of the plurality of known hand gesture poses. One of the known hand gesture poses is selected as the hand gesture pose. The selected hand gesture pose is mapped to a system command, and the system command is injected into the minimally invasive surgical system.

[0029] In still a further aspect, when the hand gesture includes a hand gesture trajectory, the user interface of the minimally invasive surgical system is controlled based on the hand gesture trajectory.

[0030] In the minimally invasive surgical system with the hand tracking system and the controller, the controller transforms the locations into a hand gesture. The controller sends a command to modify an operation mode of the minimally invasive surgical system based on the hand gesture.

[0031] In yet another aspect, a sensor element mounted on part of a human hand is tracked to obtain a location of the part of the human hand.
Based on the location, the method determines whether a position of the part of the human hand is within a threshold distance from a position of a master tool handle in a minimally invasive surgical system. Operation of the minimally invasive surgical system is controlled based on the result of the determination. In one aspect, teleoperation of a teleoperated slave surgical instrument coupled to the master tool handle is controlled based on the result of the determination. In another aspect, display of a user interface, or display of a visual proxy, is controlled based on the result of the determination.

[0032] In one aspect, the position of the part of the human hand is specified by a control point position. In another aspect, the position of the part of the human hand is a position of an index finger.

[0033] A minimally invasive surgical system includes a hand tracking system. The hand tracking system tracks a location of part of a human hand. A controller uses the location to determine whether a surgeon's hand is close enough to a master tool handle to permit a particular operation of the minimally invasive surgical system.

[0034] A minimally invasive surgical system also includes a controller coupled to the hand tracking system. The controller converts the location to a system control parameter, and injects a command based on the system control parameter into the minimally invasive surgical system.

BRIEF DESCRIPTION OF THE DRAWINGS

[0035] Figure 1 is a high-level diagrammatic view of a teleoperated minimally invasive surgical system including a hand tracking system.

[0036] Figures 2A to 2G are examples of various configurations of a hand-tracked master tool handle used to control a teleoperated slave surgical instrument of the teleoperated minimally invasive surgical system of figure 1.

[0037] Figures 3A to 3D are examples of hand gesture poses used to control system modes in the teleoperated minimally invasive surgical system of figure 1.

[0038] Figures 4A to 4C are examples of hand gesture trajectories that are also used to control system modes in the teleoperated minimally invasive surgical system of figure 1.

[0039] Figure 5 is an illustration of fiducial marker placement for hand tracking in a camera-based tracking system.

[0040] Figures 6A and 6B are more detailed diagrams of the surgeon's console of figure 1, and include examples of coordinate systems used in hand tracking by the teleoperated minimally invasive surgical system of figure 1.

[0041] Figure 7 is a more detailed illustration of a hand wearing a master finger tracking handle, and of the locations and coordinate systems used in hand tracking by the teleoperated minimally invasive surgical system of figure 1.

[0042] Figure 8 is a process flow diagram of a process used in the tracking system to track the fingers and to generate data for teleoperation of a slave surgical instrument in the teleoperated minimally invasive surgical system of figure 1.

[0043] Figure 9 is a more detailed process flow diagram of the MAP LOCATION DATA TO CONTROL POINT AND GRIP PARAMETER process of figure 8.

[0044] Figure 10 is a process flow diagram of a process used in the tracking system to recognize hand gesture poses and hand gesture trajectories.

[0045] Figure 11 is a process flow diagram of a process used in the tracking system to detect the presence of a hand.

[0046] Figure 12 is an illustration of an example of a master finger tracking device.
[0047] Figure 13 is an illustration of a video image, presented on a display device, including a proxy visual, which in this example includes a virtual ghost instrument, and a teleoperated slave surgical instrument.

[0048] Figure 14 is an illustration of a video image, presented on a display device, which in this example includes a pair of virtual hands and teleoperated slave surgical instruments.

[0049] Figure 15 is an illustration of a video image, presented on a display device, including proxy visuals, which in this example include a virtual telestration device (a remote-surgery illustration aid) and a virtual ghost instrument, together with teleoperated slave surgical instruments.

[0050] In the drawings, the first digit of a three-digit reference number indicates the figure in which the element with that reference number first appears, and the first two digits of a four-digit reference number indicate the figure in which the element with that reference number first appears.

DETAILED DESCRIPTION

[0051] As used in this document, a location includes a position and an orientation.

[0052] As used in this document, a hand gesture, sometimes called a gesture, includes a hand gesture pose, a hand gesture trajectory, and a combination of a hand gesture pose and a hand gesture trajectory.

[0053] Aspects of this invention augment the control capability of minimally invasive surgical systems, for example, the da Vinci® teleoperated minimally invasive surgical system marketed by Intuitive Surgical, Inc. of Sunnyvale, California, by using hand location information in the control of the minimally invasive surgical system. A measured location of one or more fingers is used to determine a system control parameter that in turn is used to trigger a system command in the surgical system. The system commands depend on the location of the person whose hand is being tracked, that is, on whether the person is at a surgeon's console.

[0054] When the measured locations are for fingers of a hand of a person not at the surgeon's console, the system commands include a command to change the orientation of part of a teleoperated slave surgical instrument based on a combination of the hand orientation and the relative motion of two fingers of the hand, and a command to move the tip of a teleoperated slave surgical instrument so that the motion of the tip follows the motion of part of the hand. When the measured locations are for fingers of a hand of a person at the surgeon's console, the system commands include commands permitting or preventing motion of a slave surgical instrument from continuing to follow the motion of a master tool handle. When the measured locations are for fingers of a hand of a person not at the surgeon's console, or for fingers of a hand of a person at the surgeon's console, the system commands include commanding the system, or part of the system, to take an action based on a hand gesture pose, and commanding the system, or part of the system, to take an action based on a hand gesture trajectory.

[0055] Figure 1 is a high-level diagrammatic view of a teleoperated minimally invasive surgical system 100, for example, the da Vinci® Surgical System, including a hand tracking system. There are other parts, cables, etc. associated with the da Vinci® Surgical System, but these are not illustrated in figure 1 to avoid detracting from the disclosure.
Additional information regarding minimally invasive surgical systems may be found, for example, in United States Patent Application No. 11/762,165 (filed June 13, 2007, disclosing "Minimally Invasive Surgical System"), and United States Patent No. 6,331,181 (issued December 18, 2001, disclosing "Surgical Robotic Tools, Data Architecture, and Use"), both of which are incorporated herein by reference. See also, for example, United States Patents No. 7,155,315 (filed December 12, 2005, disclosing "Camera Referenced Control in a Minimally Invasive Surgical Apparatus") and No. 7,574,250 (filed February 4, 2003, disclosing "Image Shifting Apparatus and Method for a Telerobotic System"), both of which are incorporated herein by reference.

[0056] In this example, system 100 includes a cart 110 with a plurality of manipulators.

[0057] Each manipulator, and the teleoperated slave surgical instrument controlled by that manipulator, can be coupled to and decoupled from the master tool manipulators on the surgeon's console 185, and in addition can be coupled to and decoupled from the mechanically ungrounded, unpowered master finger tracking handle 170, sometimes called master finger tracking grip 170.

[0058] A stereoscopic endoscope 112 mounted on manipulator 113 provides an image of surgical site 103 within patient 111 that is displayed on display 187 and on the display of the surgeon's console 185. The image includes images of any of the slave surgical devices in the field of view of stereoscopic endoscope 112. The interactions between the master tool manipulators on the surgeon's console 185, the slave surgical devices, and stereoscopic endoscope 112 are the same as in a known system, and so are known to those skilled in the art.

[0059] In one aspect, surgeon 181 moves at least one finger of the surgeon's hand, which in turn causes a sensor on master finger tracking handle 170 to change location. Hand tracking transmitter 175 provides a field so that the new position and orientation of the finger are sensed by master finger tracking handle 170. The newly sensed position and orientation are provided to hand tracking controller 130.

[0060] In one aspect, as explained more fully below, hand tracking controller 130 maps the sensed position and orientation to a control point position and a control point orientation in an eye coordinate system of surgeon 181. Hand tracking controller 130 sends this location information to system controller 140, which in turn sends a system command to the teleoperated slave surgical instrument coupled to master finger tracking handle 170. As explained more fully below, using master finger tracking handle 170, surgeon 181 can control, for example, the grip of an end effector of the teleoperated slave surgical instrument, as well as the roll and yaw of a wrist coupled to the end effector.

[0061] In another aspect, hand tracking of at least part of the hand of surgeon 181 or of the hand of surgeon 180 is used by hand tracking controller 130 to determine whether a hand gesture pose is made by the surgeon, or whether a combination of a hand gesture pose and a hand gesture trajectory is made by the surgeon. Each hand gesture pose and each trajectory combined with a hand gesture pose is mapped to a different system command.
The system commands control, for example, changes of system mode and other aspects of minimally invasive surgical system 100.

[0062] For example, instead of using pedals and switches as in a known minimally invasive surgical system, a hand gesture, either a hand gesture pose or a hand gesture trajectory, is used (i) to initiate following between movements of the master tool handle and the teleoperated slave surgical instrument, (ii) to activate the master clutch (which decouples master control from the slave instrument), (iii) to control the endoscopic camera (which allows the master to control endoscope movement or features such as focus or electronic zoom), (iv) to swap robotic arms (which swaps control of a particular master between two slave instruments), and (v) to toggle TILEPRO™ (which toggles the display of auxiliary video windows on the surgeon's display). (TILEPRO is a trademark of Intuitive Surgical, Inc. of Sunnyvale, CA, U.S.A.)

[0063] When there are only two master tool handles in system 100 and surgeon 180 needs to control movement of a slave surgical instrument other than the two teleoperated slave surgical instruments coupled to the master tool handles, the surgeon can lock one or both of the coupled teleoperated slave surgical instruments in place by using a first hand gesture. The surgeon then associates one or both of the master tool handles with other slave surgical instruments held by other manipulator arms by using a different hand gesture, which in this implementation provides the association of swapping the master tool handle to another teleoperated slave surgical instrument.

[0064] Surgeon 181 performs an equivalent procedure when there are only two master finger tracking handles in system 100.

[0065] In yet another aspect, a hand tracking unit 186 mounted on the surgeon's console 185 tracks at least part of the hand of surgeon 180 and sends the sensed location information to hand tracking controller 130. Hand tracking controller 130 determines when the surgeon's hand is close enough to the master tool handle to permit following, for example, movement of the slave surgical instrument following movement of the master tool handle. As explained more fully below, in one aspect hand tracking controller 130 determines the position of the surgeon's hand and the position of the corresponding master tool handle. If the difference between the two positions is within a predetermined distance, for example, less than a threshold separation, following is permitted; otherwise, following is inhibited. Thus, the distance is used as a measure of the presence of the surgeon's hand with respect to the master tool handle on the surgeon's console 185. In another aspect, when the position of the surgeon's hand relative to the position of the master tool handle is less than the threshold separation, display of a user interface on a display device is inhibited, for example, the user interface is turned off. Conversely, when the position of the surgeon's hand relative to the position of the master tool handle is greater than the threshold separation, the user interface is displayed on the display device, for example, it is turned on.

[0066] Detecting the presence of the surgeon's hand is a long-standing problem. Presence detection has often been attempted using various contact sensing technologies, such as capacitive switches, pressure sensors, and mechanical switches.
However, these approaches are inherently problematic because surgeons have different preferences about how and where they hold the master tool handle. Using distance as a presence measure is advantageous because this type of presence detection allows the surgeon to touch the master tool handle lightly and then momentarily break physical contact to adjust the master tool handle, and yet it does not constrain how the surgeon holds the master tool handle with his or her fingers.

[0067] Control of a Surgical Instrument via Hand Tracking

[0068] An example of an unpowered, mechanically ungrounded master finger tracking handle 270, sometimes referred to as master finger tracking grip 270, is illustrated in figures 2A to 2D in different configurations that are described more fully below. Master finger tracking handle 270 includes finger-mounted sensors 211, 212, sometimes referred to as finger- and thumb-mounted sensors 211, 212, which independently track the location (position and orientation, in one example) of each of an index fingertip 292B and a thumb tip 292A, that is, they track the locations of two fingers of the surgeon's hand. Thus, the location of the hand itself is tracked, as opposed to tracking the location of the master tool handles in a known minimally invasive surgical system.

[0069] In one aspect, the sensors provide tracking with six degrees of freedom (three translational and three rotational) for each finger of the hand on which a sensor is mounted. In another aspect, the sensors provide tracking with five degrees of freedom (three translational and two rotational) for each finger of the hand on which a sensor is mounted.

[0070] In yet another aspect, the sensors provide tracking with three degrees of freedom (three translational) for each finger of the hand on which a sensor is mounted. When two fingers are each tracked with three degrees of freedom, the total of six translational degrees of freedom is sufficient to control a slave surgical instrument that does not include a wrist mechanism.

[0071] A padded foam connector 210 is connected between finger- and thumb-mounted sensors 211, 212. Connector 210 constrains thumb 292A and index finger 292B, that is, the fingers of hand 291R, to be within a fixed distance, that is, there is a maximum separation distance between the fingers of hand 291R on which master finger tracking handle 270 is mounted. As thumb 292A and index finger 292B are brought from the maximum separation (figure 2A) to a fully closed configuration (figure 2D), the padding provides positive feedback to assist surgeon 181 in controlling the grip force of an end effector of a teleoperated slave surgical instrument coupled to master finger tracking handle 170.

[0072] In the position illustrated in figure 2A, with thumb 292A and index finger 292B separated by the maximum distance permitted by master finger tracking handle 270, the grip force is at a minimum. Conversely, in the position illustrated in figure 2D, where thumb 292A and index finger 292B are as close together as connector 210 permits, that is, separated by the minimum distance permitted by master finger tracking handle 270, the grip force is at a maximum. Figures 2B and 2C represent positions that are mapped to intermediate grip forces.

[0073] As explained more fully below, the locations (positions and orientations) of thumb 292A and index finger 292B in figures 2A to 2D are mapped to a grip closure parameter, for example, a normalized grip closure value, that is used to control the grip of the teleoperated slave surgical instrument coupled to master finger tracking handle 270. Specifically, the sensed locations of thumb 292A and index finger 292B are mapped to the grip closure parameter by hand tracking controller 130.

[0074] Thus, a location of part of the hand of surgeon 181 is tracked. Based on the tracked location, a system control parameter of minimally invasive surgical system 100, that is, a grip closure parameter, is generated by hand tracking controller 130 and supplied to system controller 140. System controller 140 uses the grip closure parameter in generating a system command that is sent to the teleoperated slave surgical instrument. The system command instructs the teleoperated slave surgical instrument to configure its end effector to have a grip closure corresponding to the grip closure parameter. Thus, minimally invasive surgical system 100 uses the grip closure parameter to control operation of the teleoperated slave surgical instrument of minimally invasive surgical system 100.
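The mapping from sensed fingertip separation to a normalized grip closure value can be sketched as follows. This is a minimal illustration in Python, not the patent's implementation; the separation limits are hypothetical constants standing in for the limits imposed by connector 210.

```python
import numpy as np

# Hypothetical separation limits (meters) imposed by foam connector 210:
# at MAX_SEP the grip is fully open, at MIN_SEP it is fully closed.
MAX_SEP = 0.115
MIN_SEP = 0.015

def grip_closure(thumb_pos, index_pos):
    """Map thumb/index fingertip positions to a normalized grip closure in [0, 1].

    0.0 -> fingers at maximum separation (figure 2A, minimum grip force)
    1.0 -> fingers at minimum separation (figure 2D, maximum grip force)
    """
    separation = np.linalg.norm(np.asarray(thumb_pos) - np.asarray(index_pos))
    separation = np.clip(separation, MIN_SEP, MAX_SEP)
    return (MAX_SEP - separation) / (MAX_SEP - MIN_SEP)
```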
[0075] Also, the locations (positions and orientations) of thumb 292A and index finger 292B in figures 2A to 2D are mapped to a control point position and a control point orientation by hand tracking controller 130. The control point position and control point orientation are mapped into an eye coordinate system of surgeon 181, and then supplied to system controller 140 via a command signal. The control point position and control point orientation in the eye coordinate system are used by system controller 140 in teleoperation of the slave surgical instrument coupled to master finger tracking handle 170.

[0076] Again, a location of part of the hand of surgeon 181 is tracked. Based on the tracked location, another system control parameter of minimally invasive surgical system 100, the control point position and orientation, is generated by hand tracking controller 130. Hand tracking controller 130 transmits a command signal with the control point position and orientation to system controller 140. System controller 140 uses the control point position and orientation in generating a system command that is sent to the teleoperated slave surgical instrument. The system command instructs the teleoperated slave surgical instrument to position itself based on the control point position and orientation. Thus, minimally invasive surgical system 100 uses the control point position and orientation to control operation of the teleoperated slave surgical instrument of minimally invasive surgical system 100.
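One plausible way to derive a control point pose from the two sensed fingertip locations, consistent with the X-axis convention described in the next paragraph, is sketched below. It is only an illustration under assumed conventions: the midpoint position, the `palm_up_hint` reference vector, and the frame completion are choices made here, not the system's documented method.

```python
import numpy as np

def control_point(thumb_pos, index_pos, palm_up_hint):
    """Sketch: derive a control point pose from two fingertip positions.

    The control point position is taken as the midpoint between the
    fingertips; the X-axis of the control point orientation is the unit
    vector from thumb tip to index fingertip (see the next paragraph).
    `palm_up_hint` is a hypothetical roughly-vertical reference vector used
    to complete a right-handed frame; a real system would derive it from
    the sensed finger orientations.
    """
    thumb, index = np.asarray(thumb_pos), np.asarray(index_pos)
    position = 0.5 * (thumb + index)

    x_axis = (index - thumb) / np.linalg.norm(index - thumb)
    z_axis = np.cross(x_axis, np.asarray(palm_up_hint))
    z_axis /= np.linalg.norm(z_axis)
    y_axis = np.cross(z_axis, x_axis)

    rotation = np.column_stack((x_axis, y_axis, z_axis))  # 3x3 orientation frame
    return position, rotation
```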
[0077] In addition to determining the grip closure based on the positions of sensors 211, 212, the relative motion between index finger 292B and thumb 292A is used to control the yaw motion and the roll motion of the slave surgical instrument. Rubbing index finger 292B and thumb 292A together crosswise, as if rolling a spindle, which is represented by the arrows (figure 2E) around an imaginary axis 293, produces roll of the slave surgical instrument tip, while sliding the index finger and thumb back and forth lengthwise along one another, which is represented by the arrows (figure 2F) along an axis in the pointing direction represented by arrow 295, produces yaw motion about the X-axis of the slave surgical instrument. This is accomplished by mapping the vector between the positions of the index fingertip and the thumb tip to define the X-axis of the control point orientation. The control point position remains relatively stationary, since the finger and thumb are each slid in a symmetric manner along axis 295. While the finger and thumb motions are not completely symmetric, the position still remains sufficiently stationary that the user can easily correct any disturbance that may occur.

[0078] Again, locations of part of the hand of surgeon 181 are tracked. Based on the tracked locations, yet another system control parameter, that is, the relative motion between two fingers of the surgeon's hand 291R, is generated by hand tracking controller 130.

[0079] Hand tracking controller 130 converts the relative motion into an orientation for the teleoperated slave surgical instrument coupled to master finger tracking handle 170. Hand tracking controller 130 sends a command signal with the orientation to system controller 140. While this orientation is an absolute orientation, system controller 140, in one aspect, uses this input with ratcheting during teleoperation in the same manner as an orientation input from any other passive-gimbal master tool handle. An example of ratcheting is described in commonly assigned United States Patent Application No. 12/495,213 (filed June 30, 2009, disclosing "Ratcheting for Master Alignment of a Teleoperated Surgical Instrument"), which is incorporated herein by reference in its entirety.

[0080] System controller 140 uses the orientation in generating a system command that is sent to the teleoperated slave surgical instrument. The system command instructs the teleoperated slave surgical instrument to rotate itself based on the orientation. Thus, minimally invasive surgical system 100 uses the motion between the two fingers to control operation of the teleoperated slave surgical instrument of minimally invasive surgical system 100.

[0081] When the motion is a first motion, for example, crosswise rubbing of index finger 292B and thumb 292A as if rolling a spindle, the orientation is a roll, and the system command results in a roll of the slave surgical instrument wrist tip about its pointing direction. When the motion is a second motion different from the first motion, for example, sliding the index finger and thumb back and forth along one another (figure 2F), the orientation is a yaw, and the system command results in a yaw motion of the slave surgical instrument wrist.
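One way to picture the roll/yaw distinction just described is to decompose the change of the thumb-to-index axis between successive samples. The following sketch is a hypothetical decomposition chosen here for illustration, not the controller's actual algorithm.

```python
import numpy as np

def roll_yaw_increment(x_prev, x_curr, pointing_dir):
    """Sketch: split the change of the thumb->index axis into roll and yaw.

    x_prev, x_curr: unit vectors from thumb tip to index fingertip at two
    successive samples. pointing_dir: unit vector along the pointing
    direction (axis 295). Rotation of the inter-finger axis about
    pointing_dir is treated as a roll increment (figure 2E); change of the
    axis relative to pointing_dir is treated as a yaw increment (figure 2F).
    """
    def project(v):
        # Component of v perpendicular to the pointing direction, normalized.
        p = v - np.dot(v, pointing_dir) * pointing_dir
        return p / np.linalg.norm(p)

    p0, p1 = project(x_prev), project(x_curr)
    # Signed angle between projections = roll about the pointing direction.
    roll = np.arctan2(np.dot(np.cross(p0, p1), pointing_dir), np.dot(p0, p1))
    # Change of elevation relative to the pointing direction = yaw component.
    yaw = (np.arcsin(np.clip(np.dot(x_curr, pointing_dir), -1.0, 1.0))
           - np.arcsin(np.clip(np.dot(x_prev, pointing_dir), -1.0, 1.0)))
    return roll, yaw
```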
[0082] In yet another aspect, when the surgeon changes the system's operating mode to a camera control mode, both hands are tracked, and control points and orientations for both hands are generated based on the positions and orientations sensed by the hand-mounted sensors, in one aspect. For example, as shown in figure 2G, the tips of the thumb and index finger of each hand are touched together to create a circle-like shape. The sensed position of each hand is mapped by hand tracking controller 130 to a pair of control point positions. The control point pair is sent with a camera control system event to system controller 140.

[0083] Thus, in this aspect, a location of part of each hand of surgeon 181 is tracked.

[0084] Another system control parameter of minimally invasive surgical system 100, that is, the pair of control point positions, is generated by hand tracking controller 130 based on the tracked locations. Hand tracking controller 130 sends the pair of control point positions with the camera control system event to system controller 140.

[0085] In response to the camera control system event, system controller 140 generates a camera control system command based on the pair of control point positions. The camera control system command is sent to a teleoperated endoscopic camera manipulator in minimally invasive surgical system 100. Thus, minimally invasive surgical system 100 uses the pair of control point positions to control operation of the teleoperated endoscopic camera manipulator of minimally invasive surgical system 100.

[0086] System Control via Hand Gesture Poses and Hand Gesture Trajectories

[0087] In this aspect, after being placed in a gesture detection operating mode, hand tracking controller 130 detects a hand gesture pose, or a hand gesture pose and a hand gesture trajectory. Controller 130 maps hand gesture poses to certain system mode control commands, and similarly controller 130 maps hand gesture trajectories to other system mode control commands. Note that the mapping of poses and trajectories is independent, and so this differs from, for example, sign language hand tracking. The ability to generate system commands and to control system 100 using hand gesture poses and trajectories, in place of the manipulation switches, numerous pedals, etc. of known minimally invasive surgical systems, provides greater ease of use of system 100 for the surgeon.

[0088] When a surgeon is standing, the use of hand gesture poses and hand gesture trajectories to control system 100 makes it unnecessary for the surgeon to take his or her eyes off the patient and/or the display screen and search for a pedal or switch when the surgeon wants to change the system mode. Finally, the elimination of the various switches and pedals reduces the floor space required for the teleoperated minimally invasive surgical system.

[0089] The particular set of hand gesture poses and hand gesture trajectories used to control minimally invasive surgical system 100 is not critical, provided that each hand gesture pose and each hand gesture trajectory is unambiguous. Specifically, a hand gesture pose should not be interpretable by hand tracking controller 130 as another hand gesture pose in the set of poses, and a hand gesture trajectory should not be interpretable as more than one hand gesture trajectory in the set of trajectories. Thus, the hand gesture poses and hand gesture trajectories discussed below are illustrative only and are not intended to be limiting.

[0090] Figures 3A to 3D are examples of hand gesture poses 300A to 300D, respectively.
[0091] Figures 4A to 4C are examples of hand gesture trajectories. Note, for example, that the configuration in figure 2A appears similar to that in figure 3A, but the operating mode of minimally invasive surgical system 100 is different when the two configurations are used.

[0092] In figure 2A, the teleoperated minimally invasive slave surgical instrument is coupled to master finger tracking handle 170 and system 100 is in following mode, so that movement of the teleoperated minimally invasive slave surgical instrument follows the tracked hand movement of the surgeon. In figures 3A to 3D and 4A to 4C, the surgeon places system 100 in gesture recognition mode and then makes one of the illustrated hand gesture poses or hand gesture trajectories. Hand gesture poses and hand gesture trajectories are used to control system modes and are not used in the following mode of operation. For example, the system modes controlled with hand gesture poses are for enabling, disabling, and cycling between visual elements, for erasing a visual element, and for drawing/erasing telestration.

[0093] In hand gesture pose 300A (figure 3A), thumb 292A and index finger 292B are separated beyond the master clutch threshold, for example, the distance between the two fingers of hand 291R is greater than 115 mm. Hand gesture pose 300B (figure 3B), with index finger 292B extended and thumb 292A curled, is used to signal to hand tracking controller 130 that the surgeon is tracing a hand gesture trajectory (see figures 4A and 4B). Hand gesture pose 300C (figure 3C), with thumb 292A up and index finger 292B curled, is used to turn on a user interface and to cycle between modes in the user interface. Hand gesture pose 300D (figure 3D), with thumb 292A down and index finger 292B curled, is used to turn off the user interface. Other hand gesture poses could include an "OK" hand gesture pose and an L-shaped hand gesture pose.

[0094] Hand tracking controller 130, in one aspect, uses a multidimensional feature vector to recognize and identify a hand gesture pose. Initially, a plurality of hand gesture poses is specified. Next, a feature set including a plurality of features is specified. The feature set is designed to uniquely identify each hand gesture pose in the plurality of poses.

[0095] A hand gesture pose recognition process is trained using a training database. The training database includes a plurality of instances of each hand gesture pose. The plurality of instances includes feature vectors for the poses made by a number of different people. A feature set is generated for each of the instances in the training database. These feature sets are used to train a multidimensional Bayesian classifier, as explained more fully below.

[0096] When surgeon 180 wants to enter the hand gesture operating mode, the surgeon activates a switch, for example, presses a pedal, and then makes a hand gesture pose with at least one hand. Note that while this example requires a single pedal, it permits elimination of the other pedals in the foot tray of the known minimally invasive surgical system and so still has the advantages described above. Hand tracking unit 186 sends signals representing the sensed positions and orientations of the thumb and index finger of the surgeon's hand or hands to hand tracking controller 130.

[0097] Using the tracking data for the fingers of the surgeon's hand, hand tracking controller 130 generates an observed feature set. Hand tracking controller 130 then uses the trained multidimensional Bayesian classifier and a Mahalanobis distance to determine the likelihood, that is, the probability, that the observed feature set is a feature set of a hand gesture pose in the plurality of poses. This is done for each of the hand gesture poses in the plurality of poses.

[0098] The hand gesture pose in the plurality of poses that is selected by hand tracking controller 130 as the observed hand gesture pose is the one having the smallest Mahalanobis distance, if that Mahalanobis distance is less than the maximum Mahalanobis distance in the training database for that hand gesture pose. The selected hand gesture pose is mapped to a system event. Hand tracking controller 130 injects the system event into system controller 140.
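The selection rule just described — choose the pose whose trained class has the smallest Mahalanobis distance to the observed feature set, subject to a per-pose maximum distance — can be sketched as follows. This is an illustrative reconstruction in Python with hypothetical data structures, not the patent's code.

```python
import numpy as np

class PoseClass:
    """Trained statistics for one hand gesture pose (hypothetical structure)."""
    def __init__(self, name, mean, cov, max_distance):
        self.name = name
        self.mean = np.asarray(mean)                    # mean feature vector
        self.inv_cov = np.linalg.inv(np.asarray(cov))   # inverse covariance
        self.max_distance = max_distance                # largest distance seen in training

def mahalanobis(x, pose):
    d = x - pose.mean
    return float(np.sqrt(d @ pose.inv_cov @ d))

def classify_pose(observed_features, pose_classes):
    """Return the best-matching pose name, or None if no pose is close enough."""
    x = np.asarray(observed_features)
    best = min(pose_classes, key=lambda p: mahalanobis(x, p))
    if mahalanobis(x, best) < best.max_distance:
        return best.name
    return None
```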
[0099] System controller 140 processes the system event and issues a system command. For example, if hand gesture pose 300C (figure 3C) is detected, system controller 140 sends a system command to display controller 150 to turn on the user interface. In response, display controller 150 executes at least part of user interface module 155 on processor 151 to generate a user interface on the display of the surgeon's console 185.

[00100] Thus, in this aspect, minimally invasive surgical system 100 tracks a location of part of a human hand. Based on the tracked location, a system control parameter is generated, e.g., a hand gesture pose is selected. The hand gesture pose is used to control the user interface of minimally invasive surgical system 100, for example, the user interface on the display of the surgeon's console 185.

[00101] Control of the user interface is only illustrative and is not intended to be limiting. A hand gesture can be used to perform any of the mode changes of a known minimally invasive surgical system, for example, master clutch, camera control, camera focus, manipulator arm swap, etc.

[00102] If the hand gesture pose recognition process determines that the pose is the hand gesture pose for a hand gesture trajectory, a system event is not injected by hand tracking controller 130 based on the pose recognition. Instead, a hand gesture trajectory recognition process is initiated.

[00103] In this example, hand gesture pose 300B (figure 3B) is the pose used to generate a hand gesture trajectory. Figures 4A and 4B are two-dimensional examples of hand gesture trajectories 400A and 400B that are made using hand gesture pose 300B. Figure 4C presents other two-dimensional examples of hand gesture trajectories that can be used.

[00104] In one aspect, the hand gesture trajectory recognition process uses a Hidden Markov Model Λ. To generate the probability distributions for Hidden Markov Model Λ, a training database is required. Before obtaining the training database, a set of hand gesture trajectories is specified. In one aspect, the sixteen hand gesture trajectories of figure 4C are selected.

[00105] In one aspect, a number of test subjects are selected to make each of the hand gesture trajectories. In one example, each test subject performed each trajectory a predetermined number of times. The position and orientation data for each subject for each performed trajectory were saved in the training database. In one aspect, as explained more fully below, the training database was used to train a discrete left-right Hidden Markov Model using the iterative Baum-Welch method.

[00106] When a surgeon traces a trajectory, the data are converted into an observed symbol sequence O by hand tracking controller 130. With observed symbol sequence O and Hidden Markov Model Λ, hand tracking controller 130 determines which hand gesture trajectory corresponds to the observed symbol sequence. In one aspect, hand tracking controller 130 uses the forward recursion algorithm with Hidden Markov Model Λ to generate the total probability of the observed symbol sequence. The hand gesture trajectory with the highest probability is selected if that probability is greater than a minimum probability threshold. If the highest probability is less than the minimum probability threshold, no hand gesture trajectory is selected, and the process ends.
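A minimal sketch of the forward-recursion scoring step follows: each trained trajectory has its own discrete HMM, the observed symbol sequence is scored against each, and the best-scoring trajectory is accepted only above a probability threshold. The array layout, model dictionary, and threshold are hypothetical; this is not the patent's implementation.

```python
import numpy as np

def forward_probability(A, B, pi, obs):
    """Total probability of a discrete observation sequence under an HMM.

    A:  (n_states, n_states) state transition matrix
    B:  (n_states, n_symbols) emission probability matrix
    pi: (n_states,) initial state distribution
    obs: sequence of observed symbol indices
    """
    alpha = pi * B[:, obs[0]]                 # initialize forward variables
    for symbol in obs[1:]:
        alpha = (alpha @ A) * B[:, symbol]    # forward recursion step
    return float(alpha.sum())

def recognize_trajectory(models, obs, min_prob=1e-8):
    """Return the name of the best-matching trajectory HMM, or None.

    `models` maps trajectory names to (A, B, pi) tuples trained, e.g.,
    with the Baum-Welch method.
    """
    scores = {name: forward_probability(A, B, pi, obs)
              for name, (A, B, pi) in models.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > min_prob else None
```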
[00107] The selected hand gesture trajectory is mapped to a system event. Hand tracking controller 130 injects the system event into system controller 140.

[00108] System controller 140 processes the system event and issues a system command. For example, if the selected hand gesture trajectory is mapped to an event to change the illumination level at the surgical site, system controller 140 sends a system event to a controller in an illuminator to change the illumination level.

Presence Detection via Hand Tracking

[00109] In one aspect, as indicated above, the positions of the surgeon's hands 291R, 291L (figure 6A) are tracked to determine whether teleoperation of minimally invasive surgical system 100 is permitted and, in some aspects, whether to display a user interface for the surgeon. Again, hand tracking controller 130 tracks at least part of a hand of surgeon 180B (figure 6A). Hand tracking controller 130 generates a location of a master tool handle, for example, master tool handle 621 (figure 6B), which represents master tool handles 621L, 621R (figure 6A), and a location of the part of the hand. Hand tracking controller 130 maps the two locations into a common coordinate frame and then determines the distance between the two locations in the common coordinate frame. The distance is a system control parameter of the minimally invasive surgical system that is based on the tracked location of the surgeon's hand.

[00110] If the distance is less than a safety threshold, that is, less than a maximum permitted separation between the hand part and the master tool handle, teleoperation of minimally invasive surgical system 100 is permitted; otherwise, teleoperation is prevented. Likewise, in the aspect that uses presence detection to control the display of a user interface, if the distance is less than a safety threshold, that is, less than a maximum permitted separation between the hand part and the master tool handle, display of a user interface on a display of minimally invasive surgical system 100 is prevented; otherwise, display of the user interface is permitted.

[00111] Thus, the distance is used to control teleoperation of minimally invasive surgical system 100. Specifically, hand tracking controller 130 sends a system event to system controller 140 indicating whether teleoperation is permitted. In response to the system event, system controller 140 configures system 100 to permit or prevent teleoperation.
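The presence check reduces to a distance comparison in a common coordinate frame. A minimal sketch, assuming both poses have already been mapped into that frame and using a hypothetical threshold constant:

```python
import numpy as np

# Hypothetical maximum permitted hand-to-handle separation (meters).
PRESENCE_THRESHOLD = 0.10

def teleoperation_permitted(hand_pos, handle_pos, threshold=PRESENCE_THRESHOLD):
    """Permit following only while the tracked hand part (e.g., the control
    point or index fingertip) stays within `threshold` of the master tool
    handle; both positions must be expressed in the same coordinate frame.
    """
    distance = np.linalg.norm(np.asarray(hand_pos) - np.asarray(handle_pos))
    return distance < threshold
```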
Hand Location Tracking Technologies

[00112] Before considering the various hand tracking aspects described above in greater detail, an example of a tracking technology is described. This example is illustrative only, and in view of the following description, any tracking technology that provides the necessary hand or finger location information can be used.

[00113] In one aspect, pulsed-DC electromagnetic tracking is used with sensors mounted on two fingers of a hand, for example, the thumb and index finger, as illustrated in figures 2A to 2D and figure 7. Each sensor measures six degrees of freedom and, in one aspect, has a size of eight millimeters by two millimeters by one and a half millimeters (8 mm x 2 mm x 1.5 mm). The tracking system has a right-handed hemispherical workspace of 0.8 m and a sensing resolution of 0.5 mm in position and 0.1 degree in orientation. The update rate is 160 Hertz, with a sensing latency of four milliseconds. When integrated into a system, additional latency may be incurred due to added communication and filtering. Effective command latencies of up to 30 milliseconds have been found to be acceptable.

[00114] In this aspect, a tracking system includes an electromagnetic hand tracking controller, sensors for use on the master finger tracking handle, and a hand tracking transmitter. A tracking system suitable for use in an embodiment of this invention is marketed by Ascension Technology Corporation of Burlington, Vermont, USA as the 3D Guidance trakSTAR™ System with a Mid-Range Transmitter. (trakSTAR™ is a registered trademark of Ascension Technology Corporation.) The transmitter generates pulsed DC magnetic fields for high-precision tracking over medium ranges, specified as 78 centimeters (31 inches). This system provides dynamic tracking with 240 to 420 updates per second for each sensor. The outputs of the miniaturized passive sensors are not affected by power-line noise sources. A clear line of sight between the transmitter and the sensors is not required. There is all-attitude tracking, and no inertial drift or optical interference. There is high metal immunity and no distortion from nonmagnetic metals.

[00115] While an electromagnetic tracking system with finger-worn sensors is used here, this is illustrative only and is not intended to be limiting. For example, a pen-like device could be held by the surgeon. The pen-like device is a finger piece with three or more non-collinear fiducial markers on its outer surface. Typically, to make at least three fiducial markers visible from any viewpoint, more fiducial markers are used because of self-occlusion. The fiducial markers are sufficient to determine six degrees of freedom (three translational and three rotational) of motion of the finger piece, and thus of the hand holding the pen-like device. The pen-like device also senses grip, in one aspect.

[00116] The pen-like device is viewed by two or more cameras with known parameters to localize the fiducial markers in three dimensions and to infer the three-dimensional position of the finger piece.
The fiducial markers can be implemented, for example, as 1) retro-reflective spheres with illumination close to the camera; 2) concave or convex hemispheres with illumination close to the camera; or 3) active markers such as a (blinking) LED. In one aspect, near-infrared illumination of the finger piece is used, and filters are used to block the visible spectrum at the camera to minimize distraction from background clutter.

[00117] In another aspect, a data glove 501 (figure 5) or bare hand 502 is used, and fiducial markers 511 are attached to the thumb and index finger of glove 501 (and/or to other fingers of the glove) that the surgeon will wear, and/or directly to the skin of hand 502. Again, redundant markers can be used to accommodate self-occlusion. Fiducial markers can also be placed on other fingers to enable more user interface features through specifically defined hand gestures.

[00118] The three-dimensional locations of the fiducial markers are computed by triangulation from multiple cameras having a common field of view. The three-dimensional locations of the fiducial markers are used to infer the three-dimensional location (translation and orientation) of the hand and also the size of the grip.
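As an illustration of the triangulation step just described, the classical two-camera linear (DLT) triangulation can be sketched as follows. The projection matrices are assumed to come from the camera calibration discussed below; this is a generic textbook method, not the patent's specific algorithm.

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one fiducial marker from two views.

    P1, P2: 3x4 camera projection matrices from calibration.
    uv1, uv2: (u, v) pixel coordinates of the same marker in each image.
    Returns the marker's 3D position in the common (e.g., console) frame.
    """
    u1, v1 = uv1
    u2, v2 = uv2
    # Each view contributes two linear constraints on the homogeneous point X.
    A = np.stack([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    # Least-squares solution: right singular vector of the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]   # dehomogenize
```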
[00124] As used in this document, a sensor element, sometimes called a tracking sensor, can be a sensor for any of the hand tracking technologies described above, for example, a passive electromagnetic sensor, a fiducial marker, or a sensor for any of the other technologies. Coordinate Systems [00125] Before considering the various processes described above in more detail, an example of a surgeon's console 185B (figure 6A) is considered, and several coordinate systems are defined for use in the following examples. Surgeon's console 185B is an example of surgeon's console 185. Surgeon's console 185B includes a three-dimensional display 610, sometimes referred to as display 610, master tool manipulators 620L, 620R with master tool handles 621L, 621R, and a base 630. Figure 6B is a more detailed diagram of master tool handle 621, which is representative of master tool handles 621L, 621R. [00126] Master tool handles 621L, 621R of master tool manipulators 620L, 620R are held by surgeon 180B using the index finger and thumb, so that targeting and grasping involve intuitive pointing and pinching motions. Master tool manipulators 620L, 620R in combination with master tool handles 621L, 621R are used to control teleoperated slave surgical instruments, teleoperated endoscopes, etc., in the same way as the known master tool manipulators in a known minimally invasive teleoperated surgical system. Also, the positional coordinates of master tool manipulators 620L, 620R and master tool handles 621L, 621R are known from the kinematics used in the control of the slave surgical instruments. [00127] In the normal viewing mode of operation, display 610 displays three-dimensional images of surgical site 103 from stereoscopic endoscope 112. Display 610 is positioned on console 185B (figure 6B) near the surgeon's hands, so that the image of the surgical site seen in display 610 is oriented such that surgeon 180B feels that he or she is actually looking directly down on surgical site 103. The surgical instruments in the image appear to be located substantially where the surgeon's hands are located, and oriented substantially as surgeon 180B would expect based on the position of his or her hands. However, surgeon 180B can see neither his or her hands nor the position or orientation of master tool handles 621L, 621R while viewing the displayed image of the surgical site on display 610. [00128] In one aspect, master tool manipulators 620L, 620R are moved from directly in front of surgeon 180B and out from under display 610 so that they are positioned over base 630 and are no longer under display 610, that is, the master tool manipulators are parked out of the way of hand gesturing. This provides an unobstructed volume under display 610 in which surgeon 180B can make hand gestures, either hand gesture positions or hand gesture trajectories or both. [00129] In the aspect of figure 6A, three coordinate systems are defined with respect to surgeon's console 185B: an eye coordinate system 660, a world coordinate system 670, and a tracker coordinate system 650. Note that equivalent coordinate systems are defined for surgeon 181 (figure 1), so that the mapping described more fully below can be done for tracking data from either master finger tracking handle 170 or master tool handles 621L, 621R.
See, for example, United States Patent Application 12/617,937 (filed November 13, 2009, disclosing "Patient-Side Surgeon Interface for Minimally Invasive Teleoperated Surgical Instrument"), which was previously incorporated into this document by reference. [00130] In eye coordinate system 660, surgeon 180B looks down the Z-axis Z_eye. The Y-axis Y_eye points up on the display screen. The X-axis X_eye points to the left on the display screen. In world coordinate system 670, the Z-axis Z_world is a vertical axis. The X-axis X_world and the Y-axis Y_world lie in a plane perpendicular to Z-axis Z_world. [00131] Figure 6B is a more detailed illustration of a master tool handle 621 and master tool manipulators 620. [00132] Coordinate systems 680, 690 are discussed more fully below with respect to method 1100 of figure 11. Surgical Instrument Control Process by Hand Tracking [00133] Figure 7 is an illustration of sensor 212 mounted on index finger 292B with a location 713 in tracking coordinate system 750, and of sensor 211 mounted on thumb 292A with a location 711 in tracking coordinate system 750. Sensors 211 and 212 are part of the electromagnetic tracking system described above. Thumb 292A and index finger 292B are examples of digits of right hand 291R. As noted earlier, a part of a human hand includes at least one digit of the hand. As is known to those skilled in the field, the digits of the hand, sometimes called fingers or phalanges, are the thumb (first digit), the index finger (second digit), the middle finger (third digit), the ring finger (fourth digit), and the little finger (fifth digit). [00134] In this document, the thumb and index finger are used as examples of two digits of a human hand. This is illustrative only and is not intended to be limiting. For example, the thumb and the middle finger can be used in place of the thumb and the index finger; the description in this document is directly applicable to the use of the middle finger. Also, the use of the right hand is illustrative only. When similar sensors are worn on the left hand, the description in this document applies directly to the left hand as well. [00135] Cables 741, 742 connect sensors 211, 212 of master finger tracking handle 270 to hand tracking controller 130. In one aspect, cables 741, 742 carry position and orientation information from sensors 211, 212 to hand tracking controller 130. [00136] The use of cables to transmit the sensed position and orientation data to hand tracking controller 130 is illustrative only and is not intended to be limiting to this specific aspect. In view of this disclosure, a person skilled in the field can select other mechanisms to transmit the sensed position and orientation data from the master finger tracking handle or handles to hand tracking controller 130 (for example, a wireless connection). [00137] Cables 741, 742 do not impede the movement of master finger tracking handle 270. Since master finger tracking handle 270 is mechanically ungrounded, each master finger tracking handle is effectively unconstrained in both position and orientation within the workspace reachable by the surgeon and the workspace of the hand tracking transmitter (for example, left-right, up-down, in-out, roll, pitch, and yaw in a Cartesian coordinate system).
[00138] In one aspect, as described above, each sensor 211, 212 on master finger tracking handle 270 senses three degrees of translation and three degrees of rotation, that is, six degrees of freedom. Thus, the sensed data from the two sensors represent twelve degrees of freedom. In another aspect, each sensor 211, 212 on master finger tracking handle 270 senses three degrees of translation and two degrees of rotation (yaw and pitch), that is, five degrees of freedom. Thus, the sensed data from the two sensors represent ten degrees of freedom. [00139] Using a control point position and control point orientation based on the tracked locations to control a teleoperated slave surgical instrument requires six degrees of freedom (three of translation and three of orientation), as described more fully below. Thus, in aspects where each sensor has five or six degrees of freedom, sensors 211, 212 provide redundant degrees of freedom. As described above and more fully below, the redundant degrees of freedom are mapped to parameters used to control aspects of a teleoperated slave surgical instrument other than position and orientation. [00140] In still a further aspect, each sensor 211, 212 senses only three degrees of freedom in translation, and so together the two sensors represent six degrees of freedom. This is sufficient to control three degrees of translation, roll, and grip closure of a slave surgical instrument that does not include a wrist mechanism. The following description is used to generate the location of the control point using the six degrees of freedom. The orientation of the control point is taken as the orientation of the slave surgical instrument. The grip closure parameter is determined as described below using the control point location and the control point orientation. The roll is determined as described above using the relative movement of the thumb and index finger. [00141] In any of the aspects where the sensors sense six degrees of freedom, or where the sensors sense five degrees of freedom, index finger sensor 212 generates a signal representing an index finger position p_index and an index finger orientation R_index in tracking coordinate frame 750. Thumb sensor 211 generates a signal representing a thumb position p_thumb and a thumb orientation R_thumb in tracking coordinate frame 750. In one aspect, positions p_index and p_thumb are taken as aligned with the center of the user's fingernail on index finger 292B and the center of the user's fingernail on thumb 292A, respectively. [00142] In this example, positions p_index and p_thumb are each represented by a three-by-one vector in tracking coordinate frame 750. Positions p_index and p_thumb are in tracker coordinates. [00143] Orientations R_index and R_thumb are each represented by a three-by-three rotation matrix in tracking coordinate frame 750, that is,

R_index =
[ R_index11  R_index12  R_index13 ]
[ R_index21  R_index22  R_index23 ]
[ R_index31  R_index32  R_index33 ]

and R_thumb is represented in the same way. [00144] A control point position p_cp is centered between index finger 292B and thumb 292A. Control point position p_cp is the origin of control point frame 790, but is specified in tracker coordinates. The Z-axis of control point frame 790 extends through control point position p_cp in the pointing direction, as described more fully below.
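To make the data layout concrete, the sensed quantities above can be represented as follows. This is a minimal sketch with illustrative variable names and values that form no part of the specification; it shows only that a digit's pointing direction is the third column of its three-by-three orientation matrix.

```python
import numpy as np

# Each tracked digit yields a 3-vector position and a 3x3 rotation matrix,
# both expressed in tracker coordinates (values are illustrative).
p_index = np.array([0.10, 0.02, 0.30])   # meters, tracker frame
R_index = np.eye(3)                      # identity: digit axes aligned with tracker axes
z_index = R_index[:, 2]                  # pointing direction = third column of R
```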
[00145] Also, as explained below, index finger 292B and thumb 292A are mapped to the jaws of a slave surgical instrument; however, the two digits are more dexterous than the instrument jaws. The Y-axis of control point frame 790 corresponds to the pin used to close the instrument jaws. Thus, the Y-axis of control point frame 790 is perpendicular to a vector between index finger 292B and thumb 292A, as described below. [00146] Control point position p_cp is represented as a three-by-one vector in tracker coordinates of tracking coordinate frame 750. Control point orientation R_cp is represented as a three-by-three rotation matrix in tracker coordinates, that is,

R_cp = [ x̂_cp  ŷ_cp  ẑ_cp ]

whose columns are the control point unit axis vectors. [00147] Figure 8 is a process flow diagram for mapping the location of part of a hand to a grip closure parameter used to control the grip of a slave surgical instrument, for example, one of the teleoperated slave surgical instruments in figure 1. This mapping also maps a temporal change in the location into a new grip closure parameter, a corresponding location of the slave instrument tip, and a velocity of movement to that location. [00148] Initially, upon input to process 800, process 810 RECEIVE HAND LOCATION DATA receives the index finger position and orientation (p_index, R_index) and the thumb position and orientation (p_thumb, R_thumb), which in this example are stored as data 811. The index finger position and orientation (p_index, R_index) and the thumb position and orientation (p_thumb, R_thumb) are based on data from the tracking system. Process 810 transfers to process 820 MAP LOCATION DATA TO CONTROL POINT AND GRIP PARAMETER. [00149] Process 820 MAP LOCATION DATA TO CONTROL POINT AND GRIP PARAMETER generates a control point position p_cp, a control point orientation R_cp, and a grip closure parameter g_grip using the index finger position and orientation (p_index, R_index) and the thumb position and orientation (p_thumb, R_thumb). Control point position p_cp, control point orientation R_cp, and grip closure parameter g_grip are stored as data 821. [00150] In one aspect, the control point mapping performed in process 820 is defined to emulate the key properties of the control point placement of the known master tool manipulators. In this way, the response to the movement of the thumb and index finger will be familiar and intuitive to users of a known minimally invasive teleoperated surgical system with a surgeon's console similar to surgeon's console 185B (figure 6A).
In this aspect of the 920 process GENERATE CONTROL POINT ORIENTATION, Rodriguez's axis / angle formula is used to define the direction of indication of the z-axis vector through the middle of the control point as half rotation between the vector of indication of direction of the index finger Zindicator and the vector of the direction indication of the thumb zZ Thumb. From the R.poiegar thumb orientation, the zpoiegaré thumb direction indication vector: ZPolegar = [R-Thumb13 R-Thumb23 R-Thumb33] [00154] Similarly, from the orientation of the index finger R-index, direction of indication of the vector of the index finger is: Zindicator = [R-indicator13 R-indicator 23 R- indicator 33] [00155] The vector ω is a vector perpendicular to the vector indicating the direction of the index finger Zmdicator and the vector indicating the direction of the thumb zpoiegar. The ω vector is defined as the product of the following A vectors: vector indicating the direction of the index finger and the vector indicating the direction of the thumb zpoiegar, that is, w = index X Thumb [00156] Angle 0 is the magnitude of the angle between the direction indicator vector of the Zindicator index finger and the direction thumb vector 2poiegar. The angle θ is defined as, [00157] With ω axis and 0 angle, the direction vector of the Z-axis axis is: A A Zmeio R {ω, θ / 2} * Zindicator [00158] In this way, process 910 has generated position of the control point pcp and the initial part of process 920 has generated the approximate direction of indication of the Z-axis in the control point structure 790. It is possible to proceed to interpolate the orientation fingers of the index finger and thumb to generate vector axes of control point units Xcp and Ycp in a similar manner and then re-orthogonalize them to produce a control point orientation matrix. [00159] However, greater operational dexterity can be achieved from the tracked locations of the fingers by using the following mapping. This mapping uses the relative positions of the index and thumb to effectively rotate and steer the control point as if manipulating a small cardan between the fingers. The remaining 920 process is performed as follows to generate a complete set of vector axes of ortho-normal control point units Xcp cp, and ZCP. [00160] With these vectors, the orientation of the Rep control point is: [00161] Now with processes 910 and 920, process 820 has mapped the positions of the index finger and thumb and orientations (indicator, indicator) (thumb, thumb) p3T3 POSITION dθ COntrOlβ β orientation Rq>). Process 820 must also generate the gpega handle closing parameter. Thus, process 920 GENERATE CONTROL POINT ORIENTATION transfers processing to process 930 GENERATE HANDLING CLOSURE PARAMETER. [00162] In process 930, the closure of the handle is determined by the distances from the position of the indexing index finger and the position of the thumb from the center line defined by the position of the A control point and direction of the Z-axis cp. This allows the closure to be invariable to sliding when the thumb and forefinger are touching. [00163] Thus, the position of the index finger and the position of the thumb are mapped above the Z-axis in the 790 structure. The position of the index-proj is the projection of the position of the index finger on the Z-axis of the 790 structure, and ppoiegar_-proj position is the projection of the position of the ppoiegar thumb on the Z-axis of the 790 structure. 
[00164] Position p_index_proj and position p_thumb_proj are used to evaluate a grip closure distance d_val, that is,

d_val = ‖p_index − p_index_proj‖ + ‖p_thumb − p_thumb_proj‖

[00165] In this document, the double vertical bars are the usual notation for the Euclidean norm. Grip closure distance d_val is bounded by a maximum distance threshold d_max and a minimum distance threshold d_min. As illustrated in figure 7, a foam pad 210 between sensors 211, 212 constrains the fingers to a fixed separation range, that is, between maximum distance threshold d_max and minimum distance threshold d_min. In addition, a neutral distance d0 corresponds to the separation distance when the two fingers are just touching. [00166] For a particular set of sensors and connector, maximum distance threshold d_max, minimum distance threshold d_min, and neutral distance d0 are determined empirically. In one aspect, three different sensor-and-connector combinations are provided, for small, medium, and large hands. Each combination has its own maximum distance threshold d_max, minimum distance threshold d_min, and neutral distance d0, since the length of connector 210 is different in each of the combinations. [00167] Process 930 compares distance d_val to minimum distance threshold d_min. If the comparison finds that distance d_val is less than minimum distance threshold d_min, grip closure distance d is set to minimum distance threshold d_min. Otherwise, process 930 compares distance d_val to maximum distance threshold d_max. If the comparison finds that distance d_val is greater than maximum distance threshold d_max, grip closure distance d is set to maximum distance threshold d_max. Otherwise, grip closure distance d is set to distance d_val. [00168] The test performed on distance d_val to determine grip closure distance d is summarized as:

d = d_min   if d_val < d_min
d = d_max   if d_val > d_max
d = d_val   otherwise

[00169] Next, in process 930, grip closure parameter g_grip is generated:

g_grip = (d − d0) / (d_max − d0)   if d ≥ d0
g_grip = (d − d0) / (d0 − d_min)   if d < d0

[00170] Thus, a grip closure distance d between maximum distance threshold d_max and neutral distance d0 is mapped to a value between zero and one. A grip closure distance d between minimum distance threshold d_min and neutral distance d0 is mapped to a value between minus one and zero. [00171] A value of one for grip closure parameter g_grip is obtained when index finger 292B and thumb 292A are separated to the maximum extent allowed by connector 210 (figure 2A). A value of zero for grip closure parameter g_grip is obtained when the tip of index finger 292B and the tip of thumb 292A are just touching (figure 2C). Values in the range between zero and one control the opening/closing of the jaws of the end effector of a slave surgical instrument. A value of minus one for grip closure parameter g_grip is obtained when index finger 292B and thumb 292A are touching and connector 210 is fully compressed between index finger 292B and thumb 292A (figure 2D). Values in the range between minus one and zero control the jaw force of the closed jaws of the end effector. Connector 210 provides a passive haptic cue for grip closure.
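A sketch of process 930 under the thresholding and piecewise mapping reconstructed above; d_min, d_max, and d0 are the empirically determined per-connector constants, and the function name is illustrative only.

```python
import numpy as np

def grip_parameter(p_index, p_thumb, p_cp, z_cp, d_min, d_max, d0):
    """Grip closure parameter g_grip of process 930."""
    # Project each fingertip onto the control point Z-axis, then measure the
    # residual distance of each fingertip from that axis.
    p_index_proj = p_cp + np.dot(p_index - p_cp, z_cp) * z_cp
    p_thumb_proj = p_cp + np.dot(p_thumb - p_cp, z_cp) * z_cp
    d_val = (np.linalg.norm(p_index - p_index_proj)
             + np.linalg.norm(p_thumb - p_thumb_proj))
    d = float(np.clip(d_val, d_min, d_max))   # the thresholding of paragraph [00168]
    if d >= d0:
        return (d - d0) / (d_max - d0)        # 0..1 : jaw opening/closing
    return (d - d0) / (d0 - d_min)            # -1..0 : clamping force of closed jaws
```

As a check against the text: d = d_max gives 1 (fingers fully apart), d = d0 gives 0 (fingertips just touching), and d = d_min gives -1 (connector fully compressed).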
[00172] This example of mapping grip closure distance d to a value in one of two ranges is illustrative only and is not intended to be limiting. The example illustrates mapping grip closure distance d to a value in a first range of grip closure parameter g_grip to control the opening/closing of the jaws of an end effector of a slave surgical instrument when grip closure distance d is greater than neutral distance d0. Here, "opening/closing" means opening and closing the jaws. Grip closure distance d is mapped to a value in a second range of grip closure parameter g_grip to control the jaw force of the end effector's closed jaws when grip closure distance d is less than neutral distance d0. [00173] Thus, process 820 has mapped index finger position and orientation (p_index, R_index) and thumb position and orientation (p_thumb, R_thumb) to control point position and orientation (p_cp, R_cp) and grip closure parameter g_grip, which are stored as data 821. Process 820 transfers to process 830 MAP TO WORLD COORDINATES (figure 8). [00174] Process 830 MAP TO WORLD COORDINATES receives data 821 and maps the data to the world coordinate system (see world coordinate system 670 (figure 6A)). Specifically, control point position and orientation (p_cp, R_cp) are mapped to control point position and orientation in world coordinates (p_cp_wc, R_cp_wc) using a four-by-four homogeneous transformation wcT_tc that maps coordinates in tracker coordinate system 750 (figure 7B) to coordinates in world coordinate system 670, that is,

p_cp_wc = wcR_tc p_cp + wct_tc
R_cp_wc = wcR_tc R_cp

where wcR_tc is the rotation that maps an orientation in tracker coordinates tc to an orientation in world coordinates wc, and wct_tc is the translation that maps a position in tracker coordinates tc to a position in world coordinates wc. [00175] Grip closure parameter g_grip is not changed by this mapping. The data in world coordinates wc are stored as data 831. Process 830 transfers to process 840 MAP TO EYE COORDINATES. [00176] Process 840 MAP TO EYE COORDINATES receives data 831 in world coordinates wc and maps data 831 to the eye coordinate system (see eye coordinate system 660 (figure 6A)). Specifically, control point position and orientation in world coordinates (p_cp_wc, R_cp_wc) are mapped to control point position and orientation in eye coordinates (p_cp_ec, R_cp_ec) using a four-by-four homogeneous transformation ecT_wc that maps coordinates in world coordinate system 670 (figure 6A) to coordinates in eye coordinate system 660, that is,

p_cp_ec = ecR_wc p_cp_wc + ect_wc
R_cp_ec = ecR_wc R_cp_wc

where ecR_wc is the rotation that maps an orientation in world coordinates wc to an orientation in eye coordinates ec, and ect_wc is the translation that maps a position in world coordinates wc to a position in eye coordinates ec. [00177] Again, grip closure parameter g_grip is not changed by the mapping. The data in eye coordinates are stored as data 841. Process 840 transfers to process 850 GENERATE VELOCITIES. [00178] In process 800, mapping processes 830 and 840 are described as two different processes only to facilitate the illustration.
In one aspect, mapping processes 830 and 840 are combined such that the control point data in tracker coordinates tc are mapped directly to data in eye coordinates ec using a four-by-four homogeneous transformation ecT_tc that maps coordinates in tracker coordinate system 650 (figure 6A) to coordinates in eye coordinate system 660, that is,

ecT_tc = ecT_wc wcT_tc

[00179] In this aspect, the control point position in eye coordinates p_cp_ec is:

p_cp_ec = ecR_tc p_cp + ect_tc

and the control point orientation in eye coordinates R_cp_ec is:

R_cp_ec = ecR_tc R_cp

[00180] In some aspects, the mapping to world coordinates can be eliminated. In that case, the control point data are mapped directly from the tracking coordinate system into the eye coordinate system without using a world coordinate system. [00181] For teleoperation, position, orientation, and velocity are needed. Thus, process 850 GENERATE VELOCITIES generates the necessary velocities. The velocities can be generated in several ways. Some implementations, such as inertial sensors and gyroscopes, can directly measure differential signals to produce a linear velocity and an angular velocity of the control point. If the velocities cannot be measured directly, process 850 estimates them from the location data. [00182] The velocities can be estimated using finite differences in the eye coordinate system over the sampling interval Δt. For example, linear velocity v_cp_ec is estimated as the change in control point position over the sampling interval,

v_cp_ec = Δp_cp_ec / Δt

and angular velocity ω_cp_ec is estimated from the incremental rotation ΔR = R_cp_ec(t) R_cp_ec(t − Δt)ᵀ as the rotation axis of ΔR scaled by its rotation angle and divided by Δt. [00183] In another aspect of process 850 GENERATE VELOCITIES, control point linear velocity v_cp_tc and control point angular velocity ω_cp_tc are sensed in tracker coordinates of tracker coordinate system 750 (figure 7). In this aspect, the directly sensed control point linear velocity v_cp_tc and the directly sensed control point angular velocity ω_cp_tc are rotated from tracker coordinate system 750 to eye coordinate system 660 using rotation ecR_tc. [00184] Specifically, using the rotation mappings defined above:

v_cp_ec = ecR_tc v_cp_tc
ω_cp_ec = ecR_tc ω_cp_tc

[00185] Process 850 GENERATE VELOCITIES transfers to process 860 SEND CONTROL COMMAND. Process 860 sends an appropriate system control command to the slave surgical instrument based on the position, orientation, velocities, and grip closure parameter stored as data 851. [00186] In one aspect, processes 810 to 850 are performed by hand tracking controller 130 (figure 1). Controller 130 executes finger tracking module 135 on processor 131 to perform processes 810 to 850. In this aspect, finger tracking module 135 is stored in memory 132. Process 850 sends a system event to system controller 140, which in turn performs process 860. [00187] It is to be appreciated that hand tracking controller 130 and system controller 140 can in practice be implemented by any combination of hardware, software executing on a processor, and firmware. Also, the functions of these controllers, as described in this document, can be performed by one unit or divided among different components, each of which can in turn be implemented by any combination of hardware, software executing on a processor, and firmware. When divided among different components, the components can be centralized in one location or distributed across system 100 for distributed processing purposes.
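Returning to process 850, the finite-difference estimation of paragraph [00182] can be sketched as follows; this is an illustration only, with the axis-angle extraction written out explicitly (the implementation in the system may differ).

```python
import numpy as np

def estimate_velocities(p_prev, R_prev, p_curr, R_curr, dt):
    """Finite-difference estimate of control point velocities (process 850).

    Linear velocity is the position difference over the sample interval.
    Angular velocity is the axis-angle of the incremental rotation over dt.
    """
    v = (p_curr - p_prev) / dt
    dR = R_curr @ R_prev.T                    # incremental rotation over one sample
    angle = np.arccos(np.clip((np.trace(dR) - 1.0) / 2.0, -1.0, 1.0))
    if angle < 1e-9:
        w = np.zeros(3)                       # no measurable rotation this sample
    else:
        axis = np.array([dR[2, 1] - dR[1, 2],
                         dR[0, 2] - dR[2, 0],
                         dR[1, 0] - dR[0, 1]]) / (2.0 * np.sin(angle))
        w = axis * angle / dt
    return v, w
```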
[00188] Hand Gesture Position and Hand Gesture Trajectory Control Process [00189] Figure 10 is a process flow diagram of an aspect of a process 1000 for hand gesture position and hand gesture trajectory control of system 100. In one aspect, as described above, a hand gesture position recognition process 1050 uses a multidimensional Bayesian classifier, and a hand gesture trajectory recognition process 1060 uses a discrete Hidden Markov Model Λ. [00190] As described above, figures 3A to 3D are examples of hand gesture positions. To train hand gesture position recognition process 1050, a number of gesture positions is specified. The number of hand gesture positions used is limited by the ability to define unique positions that can be unambiguously identified by recognition process 1050, and by the surgeon's ability to reliably remember and reproduce each of the different hand gesture positions. [00191] In addition to defining the hand gesture positions, a feature set including a number of features f_i, where i ranges from 1 to n, is defined. The number n is the number of features used. The number and type of features are selected so that each position in the set of permissible positions can be accurately identified. In one aspect, the number n is six. [00192] The following is an example of a feature set with n features. [00193] Feature f1 is the inner product of index finger 292B pointing direction vector ẑ_index and thumb 292A pointing direction vector ẑ_thumb. Feature f2 is the inner product of index finger 292B normal vector x̂_index and thumb 292A normal vector x̂_thumb. Feature f3 is the distance of thumb 292A projected along the pointing direction ẑ_index of index finger 292B. Feature f4 is the distance of thumb 292A from the axis along the pointing direction ẑ_index of index finger 292B. Feature f5 is the Z component of thumb 292A pointing direction vector ẑ_thumb. Feature f6 (f_n) is the inner product of thumb 292A normal vector x̂_thumb and index finger 292B pointing direction ẑ_index. [00194] Before method 1000 can be used, it is necessary to develop a training database of hand gesture positions. A number of different users produce each hand gesture position at least once, and the position and orientation data for each hand gesture position for each user are measured using the tracking system. For example, each person in a group of people makes each of the permissible hand gesture positions. The index finger and thumb positions and orientations (p_index, R_index), (p_thumb, R_thumb) are saved in the training database for each hand gesture position for each person in the group. [00195] Using the training database, a feature set {f_i} is generated for each hand gesture position for each user. The set of training feature vectors for each hand gesture position is then used to compute a mean μ and a covariance Σ. [00196] Thus, the training database is used to obtain a feature vector mean and covariance for each trained gesture. In addition, for each hand gesture position, a Mahalanobis distance d(f_i) (see the discussion below) is generated for each trainer, and the maximum Mahalanobis distance for each hand gesture position is saved as the threshold for that hand gesture position. [00197] The Mahalanobis distance measure can also be used to verify that all trained gestures are sufficiently different and unambiguous for the given feature set.
This can be achieved by testing the Mahalanobis distance between the feature vector mean of a given gesture and the feature vector means of all other permissible gesture positions. This test distance should be much greater than the maximum training distance threshold used for that given gesture. [00198] As is known to those skilled in the art, the specification of a Hidden Markov Model requires the specification of two model parameters N and M and three probability measures A, B, π. Hidden Markov Model Λ is represented as:

Λ = (A, B, π)

[00199] Model parameter N is the number of states in the model, and model parameter M is the number of observation symbols per state. The three probability measures are the state transition probability distribution A, the gesture observation probability distribution B, and the initial state distribution π. [00200] In one aspect, for a Hidden Markov Model, transition probability distribution A is an N x N matrix, gesture observation probability distribution B is an N x M matrix, and initial state distribution π is an N x 1 matrix. [00201] Given an observation sequence O and a Hidden Markov Model Λ, the probability of observation sequence O given Hidden Markov Model Λ, that is, P(O | Λ), is evaluated in process 1000, as described more fully below. [00202] To generate the probability distributions for Hidden Markov Model Λ, a training database is required. Before obtaining the training database, a set of hand gesture trajectories is specified. [00203] A number J of test subjects is selected to make each of the hand gesture trajectories. While figure 4C presents the sixteen hand gesture trajectories in a two-dimensionally projected form, the test subjects were not constrained when performing the various hand gesture trajectories, which allowed some three-dimensional variation to appear. In one aspect, each subject performed each hand gesture trajectory k times, producing J·k training sequences per hand gesture trajectory. [00204] In one aspect, a discrete left-right Hidden Markov Model was used. Hidden Markov Model Λ was chosen so that probability P(O | Λ) was locally maximized using an iterative Baum-Welch method. See, for example, Lawrence R. Rabiner, "A Tutorial on Hidden Markov Models and Selected Applications in Speech Recognition," Proceedings of the IEEE, Vol. 77, No. 2, pp. 257-286 (Feb. 1989), which is incorporated into this document by reference as a demonstration of the knowledge of Hidden Markov Models of those skilled in the art. In one aspect, the iterative method was stopped when the model converged to within 0.1 percent over three successive iterations. [00205] The initial state probability π was set so that the model always starts in state one. Transition probability matrix A was initialized with random entries, which were sorted in descending order on a row-by-row basis. To enforce the left-right structure, all entries in the lower diagonal of transition probability matrix A were set to zero. In addition, transitions of more than two states were prohibited by setting entries to zero where (j − i) > 2 for all rows i and columns j. Transition probability matrix A was then normalized on a row-by-row basis. [00206] The initialization of observation probability matrix B partitioned the observation sequence uniformly based on the desired number of states.
Then each state can initially observe one or more symbols with a probability based on a local frequency count. This matrix was also normalized on a row-by-row basis. See, for example, N. Liu, R.I.A. Davis, B.C. Lovell, P.J. Kootsookos, "Effect of Initial HMM Choices in Multiple Sequence Training for Gesture Recognition," International Conference on Information Technology, Las Vegas, pp. 608-613 (April 5-7, 2004), which is incorporated into this document by reference as a demonstration of the initialization procedures for Hidden Markov Models known to those skilled in the art. A Hidden Markov Model was developed for each of the hand gesture trajectories. [00207] Returning to method 1000, verification process 1001 GESTURE MODE ENABLED determines whether the surgeon has enabled the gesture recognition mode of operation of system 100. In one aspect, to enable the gesture recognition mode, the surgeon depresses a pedal on surgeon's console 185 (figure 1A). If the gesture recognition mode is enabled, verification process 1001 transfers to process 1010 RECEIVE HAND LOCATION DATA, and otherwise returns via RETURN 1002. [00208] Process 1010 RECEIVE HAND LOCATION DATA receives the index finger position and orientation (p_index, R_index) and the thumb position and orientation (p_thumb, R_thumb) for the gesture being made by the surgeon. As noted above, the index finger position and orientation (p_index, R_index) and thumb position and orientation (p_thumb, R_thumb) are based on data from the tracking system. Process 1010 transfers to process 1011 GENERATE FEATURES. [00209] In process 1011 GENERATE FEATURES, the index finger position and orientation (p_index, R_index) and thumb position and orientation (p_thumb, R_thumb) are used to generate each of the features f1_o to fn_o in an observed feature vector f_o. Process 1011 GENERATE FEATURES transfers to process 1012 COMPARE FEATURES WITH KNOWN POSITIONS. [00210] Process 1012 COMPARE FEATURES WITH KNOWN POSITIONS compares observed feature vector f_o with the trained feature set {f_i} for each position. This process determines the probability that the observed feature vector belongs to the training data set {f_i} of a particular hand gesture position, that is, that it corresponds to that training data set. This can be expressed as P(f_o | Ω), where the training data set {f_i} is of object class Ω. [00211] In this example, probability P(f_o | Ω) is:

P(f_o | Ω) = exp( −d(f_o) / 2 ) / ( (2π)^(N/2) |Σ|^(1/2) )

[00212] where N is the dimensionality of the feature vector, for example, n in the example above. [00213] A statistic used to characterize this probability is the Mahalanobis distance d(f_o), which is defined as:

d(f_o) = (f_o − μ)ᵀ Σ⁻¹ (f_o − μ)

The Mahalanobis distance is known to those skilled in the field. See, for example, Moghaddam, Baback and Pentland, Alex, "Probabilistic Visual Learning for Object Representation," IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 19, No. 7, pp. 696-710 (July 1997), which is incorporated into this document by reference. [00214] Using the eigenvectors Φ and eigenvalues λ_i of covariance Σ, the covariance is used in diagonalized form, Σ = Φ diag(λ_1, ..., λ_N) Φᵀ, so that Mahalanobis distance d(f_o) is:

d(f_o) = ỹᵀ diag(λ_1, ..., λ_N)⁻¹ ỹ,  where ỹ = Φᵀ (f_o − μ)

The diagonal form allows Mahalanobis distance d(f_o) to be expressed as the sum:

d(f_o) = Σ_{i=1..N} ỹ_i² / λ_i

[00215] In this example, this is the expression that is evaluated to determine Mahalanobis distance d(f_o). Therefore, process 1012 generates a Mahalanobis distance d(f_o).
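For illustration, processes 1011 to 1013 can be sketched as follows, using the diagonalized distance above; the container for the trained statistics is hypothetical and forms no part of the specification.

```python
import numpy as np

def mahalanobis(f_o, mean, eigvecs, eigvals):
    """Diagonalized Mahalanobis distance: y = Phi^T (f - mu), d = sum(y_i^2 / lambda_i)."""
    y = eigvecs.T @ (f_o - mean)
    return float(np.sum(y**2 / eigvals))

def classify_position(f_o, trained):
    """Select the trained gesture position with the smallest Mahalanobis distance,
    subject to its per-position maximum training distance (process 1013).

    `trained` maps position name -> (mean, eigvecs, eigvals, d_threshold).
    Returns None when no position is within its training threshold.
    """
    best, best_d = None, np.inf
    for name, (mean, vecs, vals, d_thr) in trained.items():
        d = mahalanobis(f_o, mean, vecs, vals)
        if d < best_d and d <= d_thr:
            best, best_d = name, d
    return best
```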
Upon completion, process 1012 transfers to process 1013 SELECT POSITION. [00216] In process 1013 SELECT POSITION, the hand gesture position having the smallest Mahalanobis distance d(f_o) is selected, if that Mahalanobis distance d(f_o) is less than the maximum Mahalanobis distance in the training database for that hand gesture position. If Mahalanobis distance d(f_o) is greater than the maximum Mahalanobis distance in the training database for that hand gesture position, no hand gesture position is selected. Process 1013 SELECT POSITION transfers to process 1014 TEMPORAL FILTER. [00217] Process 1014 TEMPORAL FILTER determines whether process 1013 has provided the same result consecutively a predetermined number of times. If process 1013 has provided the same result the predetermined number of times, process 1014 TEMPORAL FILTER transfers to verification process 1015 GESTURE POSITION, and otherwise returns. The predetermined number of times is selected such that process 1014 TEMPORAL FILTER avoids transient oscillations or spurious detections when switching between hand gesture positions. [00218] Verification process 1015 GESTURE POSITION determines whether the selected hand gesture position is the hand gesture position used in a hand gesture trajectory. If the selected hand gesture position is the hand gesture position used in a hand gesture trajectory, verification process 1015 GESTURE POSITION transfers processing to process 1020 GENERATE VELOCITY SEQUENCE, and otherwise transfers processing to verification process 1016 POSITION CHANGED. [00219] Verification process 1016 POSITION CHANGED determines whether the hand gesture position has changed since the last pass through method 1000. If the selected hand gesture position is the same as the immediately previous temporally filtered gesture position result, verification process 1016 POSITION CHANGED returns via RETURN 1003, and otherwise transfers to process 1030 MAP TO SYSTEM EVENT. [00220] Process 1030 MAP TO SYSTEM EVENT maps the selected hand gesture position to a system event, for example, the system event assigned to the hand gesture position is looked up. Upon finding the system event, process 1030 MAP TO SYSTEM EVENT transfers processing to process 1031 INJECT SYSTEM EVENT. [00221] In one aspect, process 1031 INJECT SYSTEM EVENT sends the system event to an event handler in system controller 140 (figure 1). In response to the system event, system controller 140 sends an appropriate system command to the controllers and/or other devices in system 100. For example, if the hand gesture position is assigned to a turn-on-user-interface event, system controller 140 sends a command to display controller 150 to turn on the user interface. Display controller 150 executes, on its processor, the part of user interface module 155 required to turn on the user interface. [00222] When the hand gesture position is the hand gesture position used to make a trajectory, processing in method 1000 transfers from verification process 1015 GESTURE POSITION to process 1020 GENERATE VELOCITY SEQUENCE. In one aspect, the main feature used for hand gesture trajectory recognition is a unit velocity vector. The unit velocity vector is invariant to the starting position of the gesture.
In addition, a normalized velocity vector accounts for variations in the size or speed of the gesture. Thus, in process 1020, the control point samples are converted into a normalized control point velocity sequence, that is, into a sequence of unit velocity vectors. [00223] Upon completing process 1020 GENERATE VELOCITY SEQUENCE, process 1020 transfers processing to process 1021 CONVERT VELOCITY SEQUENCE INTO SYMBOL SEQUENCE. As noted above, discrete Hidden Markov Model Λ requires a sequence of discrete symbols as input. In process 1021, the discrete symbols are generated from the normalized control point velocity sequence by vector quantization. [00224] In one aspect, the vector quantization was performed using modified K-means clustering, with the condition that the process stops when the cluster assignments stop changing. While K-means clustering is used, the process takes advantage of the fact that the features are unit vectors. In this case, vectors that are similar in direction are clustered together. This is done using the inner product between each unit feature vector and the normalized cluster center vectors as the similarity measure. [00225] The clustering is initialized with random assignments of the vectors to thirty-two clusters, and the overall process is repeated many times, where the best clustering result is selected based on a maximum total within-cluster cost indicator. Note that in this case the within-cluster cost is based on a similarity measure. Each of the resulting clusters is assigned a unique index, which serves as a symbol for the Hidden Markov Model. An input vector is then mapped to its nearest cluster mean, and the corresponding index of that cluster is used as the symbol. In this way, a sequence of unit velocity vectors can be converted into a sequence of indices or symbols. [00226] In one aspect, the clustered vectors were assigned a symbol based on a two-dimensional vector quantization codebook. In this way, process 1021 generates a sequence of observed symbols and transfers processing to process 1023 GENERATE GESTURE PROBABILITY. [00227] In one aspect, to determine which gesture corresponds to the observed symbol sequence, process 1023 GENERATE GESTURE PROBABILITY uses the forward recursion algorithm with the Hidden Markov Model to find the probability that each gesture matches the observed symbol sequence. The forward recursion algorithm is described in Rabiner, "A Tutorial on Hidden Markov Models and Selected Applications in Speech Recognition," which was previously incorporated by reference in this document. Upon completion of process 1023 GENERATE GESTURE PROBABILITY, processing transfers to process 1024 SELECT TRAJECTORY.
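A minimal sketch of the forward recursion used by process 1023 follows; it is for illustration only, and the numerical scaling Rabiner recommends for long sequences is omitted for brevity.

```python
import numpy as np

def forward_probability(obs, A, B, pi):
    """P(O | lambda) via the forward recursion.

    obs : sequence of symbol indices from the vector quantizer.
    A   : N x N state transition matrix.
    B   : N x M observation probability matrix.
    pi  : length-N initial state distribution.
    """
    alpha = pi * B[:, obs[0]]          # initialization
    for o in obs[1:]:                  # induction over the symbol sequence
        alpha = (alpha @ A) * B[:, o]
    return float(alpha.sum())          # termination
```

Process 1024 then evaluates this probability once per trained trajectory model and keeps the largest, subject to the acceptance threshold described next.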
[00228] In process 1024 SELECT TRAJECTORY, the hand gesture trajectory with the highest probability among the permissible Hidden Markov Model gesture trajectory models is selected. This probability must also be greater than a given threshold to be accepted. If the highest probability is not greater than the threshold, no hand gesture trajectory is selected. This threshold should be tuned to maximize recognition accuracy while avoiding false recognitions. [00229] Upon completion, process 1024 SELECT TRAJECTORY transfers processing to verification process 1025 TRAJECTORY FOUND. If process 1024 SELECT TRAJECTORY selected a hand gesture trajectory, verification process 1025 TRAJECTORY FOUND transfers processing to process 1030 MAP TO SYSTEM EVENT, and otherwise returns via RETURN 1004. [00230] Process 1030 MAP TO SYSTEM EVENT maps the selected hand gesture trajectory to a system event, for example, the system event assigned to the hand gesture trajectory is looked up. Upon finding the system event, process 1030 MAP TO SYSTEM EVENT transfers processing to process 1031 INJECT SYSTEM EVENT. [00231] In one aspect, process 1031 INJECT SYSTEM EVENT sends the system event to an event handler in system controller 140 (figure 1). In response to the system event, system controller 140 sends an appropriate system command to the appropriate controller(s) or device(s). For example, if the system event is assigned to an action in the user interface, system controller 140 sends a command to display controller 150 to perform that action in the user interface, for example, changing the view of the surgical site. Presence Detection Process [00232] In yet another aspect, as described above, the tracked position of at least part of the hand of surgeon 180B is used to determine whether the hand is present on master tool handle 621. Figure 11 is a process flow diagram of an aspect of a presence detection process 1100 performed, in one aspect, by hand tracking controller 130 in system 100. In one aspect, process 1100 is performed separately for each of the surgeon's hands. [00233] In process 1110 GET JOINT ANGLES, the joint angles of master tool manipulator 620 (figure 6B) are measured. [00234] Process 1110 GET JOINT ANGLES transfers processing to process 1111 GENERATE FORWARD KINEMATICS. [00235] Since the lengths of the various links in master tool manipulator 620 are known and the position of base 629 of master tool manipulator 620 is known, geometric relationships are used to generate the location of master tool handle 621 in master workspace coordinate system 680. Thus, process 1111 GENERATE FORWARD KINEMATICS generates position p_mtm of master tool handle 621 in master workspace coordinate system 680 using the angles from process 1110, and then transfers processing to process 1112 MAP TO WORLD COORDINATES. [00236] Process 1112 MAP TO WORLD COORDINATES maps position p_mtm in master workspace coordinate system 680 to a position p_mtm_wc in world coordinate system 670 (figure 6A). Specifically,

p_mtm_wc = wcT_ws p_mtm

where wcT_ws is a rigid four-by-four homogeneous transformation that maps coordinates in workspace coordinate system 680 to coordinates in world coordinate system 670. [00237] Upon completion, process 1112 MAP TO WORLD COORDINATES transfers processing to process 1130 GENERATE HAND TO MASTER HANDLE SEPARATION. [00238] Returning to process 1120 RECEIVE HAND LOCATION DATA, process 1120 RECEIVE HAND LOCATION DATA receives (retrieves) the index finger position and orientation (p_index, R_index) and the thumb position and orientation (p_thumb, R_thumb). The index finger position and orientation (p_index, R_index) and thumb position and orientation (p_thumb, R_thumb) are based on data from the tracking system. Process 1120 RECEIVE HAND LOCATION DATA transfers processing to process 1121 GENERATE HAND POSITION.
[00239] Process 1121 GENERATE HAND POSITION maps the index finger position and orientation (p_index, R_index) and thumb position and orientation (p_thumb, R_thumb) to a control point position and orientation in the tracking coordinate system, as described above, and that description is incorporated here by reference. Position p_hand is the control point position in tracking coordinates. Process 1121 GENERATE HAND POSITION transfers processing to process 1122 MAP TO WORLD COORDINATES. [00240] The use of the control point position in presence detection is illustrative only and is not intended to be limiting. In view of this disclosure, presence detection can be done, for example, using the position of the tip of the index finger and the position of the tip of the thumb, or using only one of these positions. The processes described below are equivalent for each of these various positions associated with a part of a human hand. [00241] Process 1122 MAP TO WORLD COORDINATES maps position p_hand in the tracking coordinate system to a position p_hand_wc in world coordinate system 670 (figure 6A). Specifically,

p_hand_wc = wcT_tc p_hand

where wcT_tc is a rigid four-by-four homogeneous transformation that maps coordinates in tracking coordinate system 650 to coordinates in world coordinate system 670. [00242] Upon completion, process 1122 MAP TO WORLD COORDINATES transfers processing to process 1130 GENERATE HAND TO MASTER HANDLE SEPARATION. [00243] Process 1130 GENERATE HAND TO MASTER HANDLE SEPARATION generates a separation distance d_sep between position p_mtm_wc and position p_hand_wc in world coordinate system 670. Specifically, separation distance d_sep is:

d_sep = ‖p_mtm_wc − p_hand_wc‖

[00244] Upon completion, process 1130 GENERATE HAND TO MASTER HANDLE SEPARATION transfers processing to verification process 1131 DISTANCE SAFE. [00245] Verification process 1131 DISTANCE SAFE compares separation distance d_sep to a safe distance threshold. This threshold should be small enough to be conservative while still allowing the surgeon to adjust his or her grasp or manipulate the distal end of the master tool handle. If separation distance d_sep is less than the safe distance threshold, verification process 1131 DISTANCE SAFE transfers to process 1140 HAND PRESENT ON. Conversely, if separation distance d_sep is greater than the safe distance threshold, verification process 1131 transfers to process 1150 HAND PRESENT OFF. [00246] Process 1140 HAND PRESENT ON determines whether system 100 is in teleoperation. If system 100 is in teleoperation, no action is required and teleoperation is allowed to continue, so process 1140 transfers to the start of process 1100 again. If system 100 is not in teleoperation, process 1140 HAND PRESENT ON sends a hand present event to process INJECT SYSTEM EVENT, which in turn sends the hand present event to system controller 140. [00247] Process 1150 HAND PRESENT OFF determines whether system 100 is in teleoperation. If system 100 is not in teleoperation, no action is required, and so process 1150 transfers to the start of process 1100 again. If system 100 is in teleoperation, process 1150 HAND PRESENT OFF sends a hand not present event to process INJECT SYSTEM EVENT, which in turn sends the hand not present event to system controller 140. [00248] System controller 140 determines whether the hand present event or the hand not present event requires any change in the system's operating mode and issues an appropriate command.
In one aspect, system controller 140 enables teleoperation in response to a hand present event, that is, it allows teleoperation, and disables teleoperation in response to a hand not present event, if a teleoperated minimally invasive surgical instrument is coupled to the master tool handle. As is known to those skilled in the field, a teleoperated minimally invasive surgical instrument can be decoupled from a master tool handle. [00249] In other aspects, the hand present event and the hand not present event are used by system controller 140 in combination with other events to determine whether to allow teleoperation. For example, detection of the presence of the surgeon's head can be combined with detection of the presence of the surgeon's hand or hands in determining whether to allow teleoperation. [00250] Likewise, as described above, the hand present event and the hand not present event are used by system controller 140 to control the display of a user interface on a display of the minimally invasive surgical system. When system controller 140 receives the hand not present event, if the user interface is not turned on, system controller 140 sends a command to display controller 150 to turn on the user interface. Display controller 150 executes, on its processor, the part of user interface module 155 required to turn on the user interface. When system controller 140 receives the hand present event, if the user interface is on, system controller 140 sends a command to display controller 150 to turn off the user interface. Display controller 150 executes the part of user interface module 155 required to turn off the user interface. [00251] The hand present event and the hand not present event can also be used by system controller 140 in combination with other events to determine whether to display the user interface. Thus, user interface display control and teleoperation control are examples of system mode control using presence detection, and presence detection is not intended to be limited to these two specific system control modes. [00252] For example, presence detection can be used to control a substitute visual, such as those described more fully below. Also, combinations of the different modes, for example, teleoperation and substitute visual display, can be controlled by system controller 140 based on the hand present and hand not present events. [00253] Also, hand presence detection is useful in eliminating the dual use of master tool handles 621L, 621R, for example, pressing a pedal and then using master tool handles 621L, 621R to control a user interface that is displayed on surgeon's console 185B. When the master tool handles are dual-use, for example, used to control both a surgical instrument and a user interface, the surgeon typically has to press a pedal to switch to the user interface mode of operation. If, for some reason, the surgeon fails to press the pedal but believes that the system has switched to the user interface mode of operation, movement of the master tool handle may result in unwanted movement of the surgical instrument. Presence detection process 1100 is used to avoid this problem and to eliminate the dual use of the master tool handles.
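The core of processes 1130 to 1150 reduces to a distance test gated by the current teleoperation state. A minimal sketch follows; the event encoding and function name are illustrative and form no part of the specification.

```python
import numpy as np

def presence_event(p_mtm_wc, p_hand_wc, d_safe, teleoperating):
    """Processes 1130-1150: compare the hand-to-handle separation against a
    safe distance threshold and emit an event only when the state must change.

    Returns "HAND_PRESENT", "HAND_NOT_PRESENT", or None (no action needed).
    """
    d_sep = np.linalg.norm(p_mtm_wc - p_hand_wc)   # separation in world coordinates
    if d_sep < d_safe:
        # Hand is at the handle: report presence only if not already teleoperating.
        return None if teleoperating else "HAND_PRESENT"
    # Hand is away from the handle: report absence only if currently teleoperating.
    return "HAND_NOT_PRESENT" if teleoperating else None
```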
[00254] With presence detection process 1100, in one example, when the hand not present event is received by system controller 140, system controller 140 sends a system command to lock master tool manipulators 620L, 620R (figure 6A) in place, and sends a system command to display controller 150 to display the user interface on the display of surgeon's console 185B. The movement of the surgeon's hand is tracked and used to control elements of the user interface, for example, moving a slider switch, changing the display, etc. As noted above, the control point is mapped into the eye coordinate frame, and so it can be associated with the location of an element in the user interface. The movement of the control point is then used to manipulate that element. This is accomplished without the surgeon having to activate a pedal, and it is done in such a way that the surgeon cannot unintentionally move a surgical instrument. It thus eliminates the problems associated with using the master tool handles to control both the surgical instrument and the user interface. [00255] In the example above, the world coordinate frame is an example of a common coordinate frame. The use of the world coordinate frame as the common coordinate frame is illustrative only and is not intended to be limiting. Master Finger Tracking Handle [00256] Figure 12 is an illustration of an example of a master finger tracking handle 1270. Master finger tracking handle 1270 is an example of master finger tracking handles 170, 270. [00257] Master finger tracking handle 1270 includes a compressible body 1210 and two finger loops 1220, 1230. Compressible body 1210 has a first end 1213 and a second end 1214. [00258] Compressible body 1210 has an exterior outer surface. The outer surface includes a first portion 1216 and a second portion 1217. The first portion 1216, for example, an upper portion, extends between first end 1213 and second end 1214. The second portion 1217, for example, a bottom portion, extends between first end 1213 and second end 1214. Second portion 1217 is opposite and removed from first portion 1216. [00259] In one aspect, the outer surface is a surface of a fabric casing. The fabric is suitable for use in an operating room. The fabric casing encloses compressible foam. The foam is selected to provide resistance to compression and to expand as the compression is released. In one aspect, several strips of foam are included within the fabric casing. The foam must also be able to bend so that first portion 1216 is positioned between the first and second digits of a human hand as the tip of the first digit is moved toward the tip of the second digit. [00260] Body section 1215 has a length L between finger loop 1220 and finger loop 1230. As explained above, length L is selected to limit the separation between a first digit in loop 1220 and a second digit in loop 1230 (see figure 2A). [00261] In one aspect, body section 1215 has a thickness T.
As illustrated in figure 2C, thickness T is selected so that when master finger tracking handle 1270 is configured such that region 1236 on second portion 1217 of the outer exterior surface adjacent to end 1214 and region 1236 on second portion 1217 adjacent to end 1213 are just touching, second portion 1217 along length L is not in complete contact with itself. [00262] First finger loop 1220 is affixed to compressible body 1210 adjacent to first end 1213. Loop 1220 extends around a region 1225 of first portion 1216 of the outer exterior surface of compressible body 1210. Upon placement of first finger loop 1220 on a first finger of the human hand, region 1225 contacts the first finger, e.g., a first part of first portion 1216 of the outer exterior surface contacts the thumb. [00263] In this example, finger loop 1220 has two ends, a first fabric end 1221A and a second fabric end 1221B. End 1221A and end 1221B are ends of a strip of fabric that is affixed to body 1210. A piece of loop fabric 1222B is attached to an inner surface of end 1221B, and a piece of hook fabric 1222A is attached to an outer surface of end 1221A. An example of hook fabric and loop fabric is a nylon fastening tape consisting of two strips of nylon fabric, one having tiny hooked threads and the other having a coarse surface. The two strips form a strong bond when pressed together. One example of a commercially available fastening tape is VELCRO® fastening tape. (VELCRO® is a registered trademark of Velcro Industries B.V.). [00264] Second finger loop 1230 is affixed to compressible body 1210 adjacent to second end 1214. Loop 1230 extends around a region 1235 of first portion 1216 of the outer exterior surface of compressible body 1210. Upon placement of second finger loop 1230 on a second finger of the human hand, region 1235 contacts the second finger, e.g., a second part of first portion 1216 of the outer exterior surface contacts the index finger. Second part 1235 of the first portion is opposite and removed from first part 1225 of the first portion. [00265] In this example, finger loop 1230 also has two ends, a first fabric end 1231A and a second fabric end 1231B. End 1231A and end 1231B are ends of a strip of fabric that is affixed to body 1210. A piece of loop fabric 1232B is attached to an inner surface of end 1231B, and a piece of hook fabric 1232A is attached to an outer surface of end 1231A. [00266] A first location tracking sensor 1211 is affixed to first finger loop 1220. A second location tracking sensor 1212 is affixed to second finger loop 1230. The location tracking sensors can be any of the sensor elements described above. In one example, location tracking sensors 1211, 1212 are passive electromagnetic sensors. Substitute Visual System [00267] In one aspect, the hand tracking control system is used to control any one of a number of substitute visuals that can be used by one surgeon to proctor another surgeon. For example, when surgeon 180 (figure 1A) is being proctored by surgeon 181 using master finger tracking handle 170, surgeon 181 uses master finger tracking handle 170 to control a substitute visual of a surgical instrument, while surgeon 180 uses a master tool handle to control a teleoperated slave surgical instrument.
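By way of a non-limiting illustration only, the following minimal sketch shows one way the two tracked sensor locations of paragraph [00266] could be reduced to a control point and a single-degree-of-freedom grip closure parameter of the kind discussed above (compare claims 10 and 16 below). The position-only treatment and the calibration distance are assumptions of the sketch, not the patented mapping, which also handles orientation.

```python
import numpy as np

def control_point_and_grip(p_thumb, p_index, max_separation=0.05):
    """Sketch: reduce two tracked fingertip positions (in meters) to a
    control point and a normalized grip closure parameter. max_separation
    is a hypothetical calibration value, e.g., the separation permitted by
    the length L of the compressible body."""
    p_thumb = np.asarray(p_thumb, dtype=float)
    p_index = np.asarray(p_index, dtype=float)
    control_point = 0.5 * (p_thumb + p_index)   # midpoint of the fingertips
    separation = float(np.linalg.norm(p_index - p_thumb))
    # 0.0 when the fingers are fully apart, 1.0 when the fingertips just
    # touch; compressing the body further would map beyond this range.
    grip_closure = 1.0 - min(separation / max_separation, 1.0)
    return control_point, grip_closure

# Example: fingertips 2 cm apart.
cp, grip = control_point_and_grip([0.0, 0.0, 0.0], [0.02, 0.0, 0.0])
```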
[00268] Alternatively, surgeon 181 can telestrate on the monitor display, or can control a virtual hand on the monitor display. Also, surgeon 181 can demonstrate how to manipulate a master tool handle on the surgeon's console by manipulating a virtual image of master tool handle 621 that is presented on the display. These examples of substitute visuals are illustrative only and are not intended to be limiting. [00269] In addition, the use of master finger tracking handle 170 while not at a surgeon's console is also illustrative and is not intended to be limiting. For example, with the presence detection system described above, a surgeon at the surgeon's console could remove a hand from a master tool handle, and then use that hand to proctor another surgeon as the hand is tracked by the hand tracking system. [00270] To facilitate proctoring, a substitute visual module (not shown) is executed as part of a vision processing subsystem. In this aspect, the executing module receives the position and orientation of the control point of the proctor's hand, renders stereoscopic images, and composites them in real time with the endoscopic camera images for display on any combination of surgeon's console 185, the assistant's monitor display, and patient-side surgeon interface display 187. [00271] When surgeon 181 initiates proctoring by taking a predefined action, e.g., making a hand gesture pose, the substitute visual system is activated, e.g., the substitute visual module is executed on a processor module. The specific action, e.g., hand gesture pose, used as the predefined action is not essential so long as system controller 140 (figure 1) is configured to recognize that action. [00272] In one aspect, the substitute visual is a virtual ghost instrument 1311 (figure 13) controlled by master finger tracking handle 170, while teleoperated slave surgical instrument 1310 is controlled by one of the master tool manipulators on surgeon's console 185. Surgeon 181 sees both instruments 1310 and 1311 on the stereoscopic display of surgeon's console 185. The use of virtual ghost instrument 1311 as the substitute visual is illustrative only and is not intended to be limiting to this particular image. In view of this disclosure, other images can be used for the substitute visual that facilitate differentiation between the image representing the substitute visual and the image of the actual end effector of the teleoperated slave surgical instrument. [00273] Virtual ghost instrument 1311 appears similar to actual instrument 1310, except that virtual ghost instrument 1311 is displayed in a way that clearly distinguishes it from the actual instrument (e.g., a transparent or translucent image, a distinctly colored image, etc.). Control and operation of virtual ghost instrument 1311 is the same as that described above for an actual teleoperated surgical instrument. Thus, surgeon 181 can manipulate virtual ghost instrument 1311 using master finger tracking handle 170 to demonstrate correct use of teleoperated slave surgical instrument 1310. Surgeon 180 can mimic the motion of virtual ghost instrument 1311 with instrument 1310. [00274] Virtual ghost instruments are described more completely in United States Patent Application Publication No.
US 2009/0192523 A1 (filed March 31, 2009, disclosing "Synthetic Representation of a Surgical Instrument"), which is incorporated in this document by reference in its entirety. See also United States Patent Application No. 12/485,503 (filed June 16, 2009, disclosing "Virtual Measurement Tool for Minimally Invasive Surgery"); United States Patent Application No. 12/485,545 (filed June 16, 2009, disclosing "Virtual Measurement Tool for Minimally Invasive Surgery"); United States Patent Application Publication No. US 2009/0036902 A1 (filed August 11, 2008, disclosing "Interactive User Interfaces for Minimally Invasive Robotic Surgical Systems"); United States Patent Application Publication No. US 2007/0167702 A1 (filed December 30, 2005, disclosing "Medical Robotic System Providing Three-Dimensional Telestration"); United States Patent Application Publication No. US 2007/0156017 A1 (filed December 30, 2005, disclosing "Stereoscopic Telestration for Robotic Surgery"); and United States Patent Application Publication No. US 2010/0164950 A1 (filed May 13, 2009, disclosing "Efficient 3-D Telestration for Local Robotic Proctoring"), each of which is incorporated in this document by reference. [00275] In another aspect, the substitute visual is a pair of virtual hands 1410, 1411 (figure 14) controlled by master finger tracking handle 170 and a second master finger tracking handle, which is not visible in figure 1. Teleoperated slave surgical instruments 1420, 1421 are controlled by the master tool manipulators on surgeon's console 185. Surgeon 181 sees video image 1400 on visual display device 187, and surgeon 180 also sees video image 1400 on the stereoscopic display of surgeon's console 185. Virtual hands 1410, 1411 are displayed in a way that makes them clearly distinguishable from the other objects in video image 1400. [00276] The opening and closing of the thumb and forefinger of a virtual hand is controlled using the grip closure parameter that was described above. The position and orientation of the virtual hand are controlled by the position and orientation of the control point, as described above, which are mapped into the eye coordinate space, also as described above. [00277] Thus, as surgeon 181 moves the surgeon's right hand in three dimensions, virtual hand 1411 follows the movement in video image 1400. Surgeon 181 can roll virtual hand 1411 to instruct surgeon 180 to roll teleoperated slave surgical instrument 1421. Surgeon 181 can move virtual hand 1410 to a particular location and then use thumb and forefinger movement to instruct surgeon 180 to move teleoperated slave surgical instrument 1420 to that location and to grasp tissue. When surgeon 180 grasps the tissue with instrument 1420, surgeon 181 can use virtual hand 1410 to instruct surgeon 180 how to move the tissue. All of this occurs in real time, and virtual hands 1410, 1411 are superimposed on the stereoscopic endoscope image. However, the substitute visuals can also be used in a monoscopic system. [00278] In another aspect, surgeon 181 changes the display mode using a hand gesture pose so that the substitute visuals are a virtual ghost instrument 1510 and a virtual telestration device 1511 (telestration being the drawing of illustrations over the displayed surgical image), which are shown in video image 1500 (figure 15).
Telestration device 1511 is controlled by master finger tracking handle 170, while a second master finger tracking handle, which is not visible in figure 1, controls virtual ghost instrument 1510. [00279] Teleoperated slave surgical instruments 1520, 1521 are controlled by the master tool manipulators on surgeon's console 185. Surgeon 181 sees video image 1500 on visual display device 187, and surgeon 180 also sees video image 1500 on the stereoscopic display of surgeon's console 185. Virtual telestration device 1511 and virtual ghost instrument 1510 are displayed in a way that clearly distinguishes them from the other objects in video image 1500. [00280] To superimpose graphics and annotations with virtual telestration device 1511, surgeon 181 brings his thumb and forefinger together as if holding an imaginary pen, and then moves his right hand with thumb and forefinger in this pose to telestrate on the displayed video image. In video image 1500, surgeon 181 has so positioned his thumb and forefinger and made mark 1512 to illustrate where the tissue should be cut using surgical instrument 1521. After mark 1512 was made, surgeon 181 separated his thumb and forefinger and moved virtual telestration device 1511 to the position shown in video image 1500. [00281] The marking capability of virtual telestration device 1511 is controlled using the grip closure parameter that was described above. As noted above, when the thumb and forefinger are just touching, the grip closure parameter is mapped to an initial value in a second range, and so when the grip closure parameter is in the second range, telestration is enabled for telestration device 1511. The position and orientation of the control point, after being mapped into the eye coordinate system, are used to control the motion of virtual telestration device 1511. [00282] The above description and accompanying drawings that illustrate aspects and embodiments of the present inventions should not be taken as limiting; the claims define the protected inventions. Various mechanical, compositional, structural, electrical, and operational changes can be made without departing from the spirit and scope of this description and the claims. In some instances, well-known circuits, structures, and techniques have not been shown or described in detail to avoid obscuring the invention. [00283] Furthermore, the terminology of this description is not intended to limit the invention. For example, spatially relative terms, such as "beneath", "below", "lower", "above", "upper", "proximal", "distal", and the like, may be used to describe the relationship of one element or attribute to another element or attribute as illustrated in the figures. These spatially relative terms are intended to encompass different positions (that is, locations) and orientations (that is, rotational placements) of the device in use or operation, in addition to the position and orientation shown in the figures. For example, if the device in the figures is turned over, elements described as "below" or "beneath" other elements or attributes would then be "above" or "over" the other elements or attributes.
Thus, the exemplary term "below" can encompass both positions and orientations of above and below. The device may be otherwise oriented (rotated 90 degrees or at other orientations), and the spatially relative descriptors used in this document are interpreted accordingly. Likewise, descriptions of movement along and around various axes include various spatial device positions and orientations. [00284] The singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context indicates otherwise. The terms "comprises", "comprising", "includes", and the like specify the presence of stated attributes, steps, operations, process elements, and/or components, but do not preclude the presence or addition of one or more other attributes, steps, operations, process elements, components, and/or groups. Components described as coupled may be directly coupled electrically or mechanically, or they may be indirectly coupled via one or more intermediate components. [00285] Memory refers to a volatile memory, a non-volatile memory, or any combination of the two. A processor is coupled to a memory containing instructions executed by the processor. This can be accomplished within a computer system, or alternatively via a connection to another computer using modems and analog lines, or digital interfaces and a digital carrier line. [00286] In this document, a computer program product includes a medium configured to store computer-readable code needed for any one or any combination of the processes described with respect to hand tracking, or in which computer-readable code for any one or any combination of the processes described with respect to hand tracking is stored. Some examples of computer program products are CD-ROM discs, DVD discs, flash memory, ROM cards, floppy discs, magnetic tapes, computer hard drives, servers on a network, and signals transmitted over a network representing computer-readable program code. A non-transitory tangible computer program product includes a non-transitory tangible medium configured to store computer-readable instructions for any one or any combination of the processes described with respect to the various controllers, or in which computer-readable instructions for any one or any combination of the processes described with respect to the various controllers are stored. Non-transitory tangible computer program products are CD-ROM discs, DVD discs, flash memory, ROM cards, floppy discs, magnetic tapes, computer hard drives, and other non-transitory physical storage media. [00287] In view of this disclosure, instructions used in any one or any combination of the processes described with respect to hand tracking can be implemented in a wide variety of computer system configurations using an operating system and computer programming language of interest to the user. [00288] The use of different processors and memories in figure 1 is illustrative only and is not intended to be limiting. In some aspects, a single hardware processor can be used, and in other aspects multiple processors can be used. [00289] Also, for ease of illustration, the various processes were distributed between a hand tracking controller and a system controller. This too is illustrative and is not intended to be limiting. The various processes can be distributed across controllers, or consolidated in one controller, without changing the principles of operation of the hand tracking process.
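As a non-limiting illustration of the distribution mentioned in paragraph [00289], the sketch below decouples a hand tracking process from a system control process with a simple event queue; the event strings and thread layout are assumptions of the sketch, not the patented architecture.

```python
import queue
import threading

events = queue.Queue()

def hand_tracking_controller():
    # Placeholder for a process that reads sensor locations, classifies
    # presence and gestures, and publishes events rather than calling the
    # system controller directly.
    for event in ("hand_not_present", "hand_present", None):  # None = shutdown
        events.put(event)

def system_controller():
    while True:
        event = events.get()
        if event is None:
            break
        # Placeholder: gate teleoperation / user interface display here.
        print("system controller received:", event)

tracker = threading.Thread(target=hand_tracking_controller)
tracker.start()
system_controller()
tracker.join()
```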
[00290] All examples and illustrative references are non-limiting and should not be used to limit the claims to the specific implementations and embodiments described in this document and their equivalents. The headings are for formatting only and should not be used to limit the subject matter in any way, because text under one heading may cross-reference or apply to text under one or more other headings. Finally, in view of this disclosure, particular attributes described in relation to one aspect or embodiment may be applied to other disclosed aspects or embodiments of the invention, even though not specifically shown in the drawings or described in the text.
Claims (18) [0001] 1. Minimally invasive surgical system (100), characterized by the fact that it comprises a master finger tracking device (1270), the master finger tracking device comprising: a first finger loop (1220); a first sensor element (1211) affixed to the first finger loop (1220); a second finger loop (1230); a second sensor element (1212) affixed to the second finger loop (1230); and a compressible body (1210) comprising a first end (1213) and a second end (1214), the second end (1214) being opposite and removed from the first end (1213); wherein the first finger loop (1220) is affixed adjacent to the first end (1213); the second finger loop (1230) is affixed adjacent to the second end (1214); and upon placement of the first finger loop (1220) on a first digit (292B) and of the second finger loop (1230) on a second digit (292A), the compressible body (1210) is positioned between the first and second digits (292B, 292A) so that the compressible body (1210) provides resistance to movement of the first digit (292B) towards the second digit (292A). [0002] 2. System according to claim 1, characterized by the fact that the compressible body (1210) further comprises an outer exterior surface including a first portion (1216) extending between the first and second ends (1213, 1214), and a second portion (1217), opposite and removed from the first portion (1216), extending between the first and second ends (1213, 1214). [0003] 3. System according to claim 2, characterized by the fact that the first finger loop (1220) is affixed to the compressible body (1210) adjacent to the first end (1213) and extends around the first portion (1216) of the outer exterior surface, so that upon placement of the first finger loop (1220) on a first digit (292B) of a human hand, a first part of the first portion (1216) of the outer exterior surface contacts the first digit (292B); the second finger loop (1230) is affixed to the compressible body (1210) adjacent to the second end (1214) and extends around the first portion (1216) of the outer exterior surface; and upon placement of the second finger loop (1230) on a second digit (292A) of the human hand, a second part of the first portion (1216) of the outer exterior surface contacts the second digit (292A). [0004] 4. System according to claim 1, characterized by the fact that a length (L) of the compressible body (1210) is selected to limit a separation between the first finger loop (1220) and the second finger loop (1230) upon placement of the first finger loop (1220) on the first digit (292B) and of the second finger loop (1230) on the second digit (292A). [0005] 5. System according to claim 1, characterized by the fact that a thickness (T) of the compressible body (1210) is selected so that when a tip of the first digit (292B) of a human hand just touches a tip of the second digit (292A) of the human hand, the compressible body (1210) is less than fully compressed. [0006] 6. System according to claim 1, characterized by the fact that it further comprises a teleoperated slave surgical instrument (1310, 1420, 1421, 1520, 1521) having an end effector, the compressible body (1210) being configured to provide tactile feedback corresponding to a grip force of the end effector. [0007] 7.
System according to claim 1, characterized by the fact that a portion of each of the first and second finger loops (1220, 1230) comprises a hook fabric (1222A, 1232A) and another portion of each of the first and second finger loops (1220, 1230) comprises a loop fabric (1222B, 1232B), the loop fabric (1222B, 1232B) being removably attached to the hook fabric (1222A, 1232A) to form a finger loop (1220, 1230). [0008] 8. System according to claim 1, characterized by the fact that each of the first and second sensor elements comprises a passive electromagnetic sensor. [0009] 9. System according to claim 8, characterized by the fact that each passive electromagnetic tracking sensor has six degrees of freedom. [0010] 10. Method, characterized by the fact that it comprises: receiving, in a controller (130, 140), a first location of a sensor (1211) mounted on a first digit (292A) of a human hand and a second location of another sensor (1212) mounted on a second digit (292B) of the human hand, each of the first and second locations having N degrees of freedom, where N is an integer; mapping, by the controller (130, 140), the first location and the second location to a control point location, the control point location having six degrees of freedom, the six degrees of freedom being less than or equal to 2*N degrees of freedom; mapping, by the controller (130, 140), the first location and the second location to a parameter having a single degree of freedom; and controlling, by the controller (130, 140), teleoperation of a slave surgical instrument (1310, 1420, 1421, 1520, 1521) in a minimally invasive surgical system (100) based on the control point location and the parameter. [0011] 11. Method according to claim 10, characterized by the fact that the parameter comprises a grip closure distance. [0012] 12. Method according to claim 10, characterized by the fact that the parameter comprises an orientation. [0013] 13. Method according to claim 12, characterized by the fact that the orientation comprises a roll. [0014] 14. Method according to claim 10, characterized by the fact that N is six. [0015] 15. Method according to claim 10, characterized by the fact that N is five. [0016] 16. Method, characterized by the fact that it comprises: receiving, in a controller (130, 140), a first location of a sensor (1211) mounted on a first digit (292A) of a human hand and a second location of another sensor (1212) mounted on a second digit (292B) of the human hand, each of the first and second locations having three degrees of freedom; mapping, by the controller (130, 140), the first location and the second location to a control point position, the control point position having three degrees of freedom; mapping, by the controller (130, 140), the first location and the second location to a parameter having a single degree of freedom; and controlling teleoperation of a slave surgical instrument (1310, 1420, 1421, 1520, 1521) that does not include a wrist mechanism in a minimally invasive surgical system (100) based on the control point position and the parameter. [0017] 17. Method according to claim 16, characterized by the fact that the parameter comprises a grip closure distance. [0018] 18. Method according to claim 16, characterized by the fact that the parameter comprises a roll.